Merge "Enforce ref only for gerrit events that supply a ref"
diff --git a/README.rst b/README.rst
index 1b227e7..ff4d938 100644
--- a/README.rst
+++ b/README.rst
@@ -9,7 +9,7 @@
 To browse the latest code, see: https://git.openstack.org/cgit/openstack-infra/zuul/tree/
 To clone the latest code, use `git clone git://git.openstack.org/openstack-infra/zuul`
 
-Bugs are handled at: https://launchpad.net/zuul
+Bugs are handled at: https://storyboard.openstack.org/#!/project/679
 
 Code reviews are, as you might expect, handled by gerrit. The gerrit they
 use is http://review.openstack.org
diff --git a/doc/source/cloner.rst b/doc/source/cloner.rst
index bb33f82..2ddf0b5 100644
--- a/doc/source/cloner.rst
+++ b/doc/source/cloner.rst
@@ -75,3 +75,15 @@
 projects::
 
   zuul-cloner project project/plugins/plugin1
+
+Cached repositories
+-------------------
+
+The ``--cache-dir`` option can be used to reduce network traffic by
+cloning from a local repository which may not be up to date.
+
+If the ``--cache-dir`` option is supplied, zuul-cloner will first
+clone each project it processes from the copy found in that
+directory.  The URL of the origin remote of the resulting clone will
+be reset to use the ``git_base_url``, and the remote will then be
+updated so that the repository has all of the information in the
+upstream repository.
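+
+For example, assuming repositories have been pre-cloned under
+``/opt/git`` (an illustrative path), the cache can seed the initial
+workspace clones::
+
+  zuul-cloner --cache-dir /opt/git git://git.openstack.org \
+      openstack-infra/zuul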
diff --git a/doc/source/triggers.rst b/doc/source/triggers.rst
index c4485bf..dd650f2 100644
--- a/doc/source/triggers.rst
+++ b/doc/source/triggers.rst
@@ -4,8 +4,7 @@
 ========
 
 The process of merging a change starts with proposing a change to be
-merged.  Primarily, Zuul supports Gerrit as a triggering system, as
-well as a facility for triggering jobs based on a timer.
+merged.  Primarily, Zuul supports Gerrit as a triggering system.
 Zuul's design is modular, so alternate triggering and reporting
 systems can be supported.
 
@@ -40,3 +39,8 @@
 
 A simple timer trigger is available as well.  It supports triggering
 jobs in a pipeline based on cron-style time instructions.
+
+Zuul
+----
+
+The Zuul trigger generates events based on internal actions in Zuul.
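+
+For example, a pipeline might be triggered whenever Zuul merges a
+change to one of its projects; the snippet below is illustrative::
+
+  trigger:
+    zuul:
+      - event: project-change-merged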
diff --git a/doc/source/zuul.rst b/doc/source/zuul.rst
index cdab4b7..6cb5d59 100644
--- a/doc/source/zuul.rst
+++ b/doc/source/zuul.rst
@@ -36,7 +36,7 @@
 The three sections of this config and their options are documented below.
 You can also find an example zuul.conf file in the git
 `repository
-<https://github.com/openstack-infra/zuul/blob/master/etc/zuul.conf-sample>`_
+<https://git.openstack.org/cgit/openstack-infra/zuul/tree/etc/zuul.conf-sample>`_
 
 gearman
 """""""
@@ -295,6 +295,7 @@
 
   - name: check
     manager: IndependentPipelineManager
+    source: gerrit
     trigger:
       gerrit:
         - event: patchset-created
@@ -311,6 +312,11 @@
   This is an optional field that may be used to provide a textual
   description of the pipeline.
 
+**source**
+  A required field that specifies a trigger that provides access to
+  the change objects that this pipeline operates on.  Currently only
+  the value ``gerrit`` is supported.
+
 **success-message**
   An optional field that supplies the introductory text in message
   reported back to Gerrit when all the voting builds are successful.
@@ -383,7 +389,7 @@
     DependentPipelineManager, see: :doc:`gating`.
 
 **trigger**
-  Exactly one trigger source must be supplied for each pipeline.
+  At least one trigger source must be supplied for each pipeline.
   Triggers are not exclusive -- matching events may be placed in
   multiple pipelines, and they will behave independently in each of
   the pipelines they match.  You may select from the following:
@@ -469,6 +475,31 @@
     supported, not the symbolic names.  Example: ``0 0 * * *`` runs
     at midnight.
 
+  **zuul**
+    This trigger supplies events generated internally by Zuul.
+    Multiple events may be listed.
+
+    *event*
+    The event name.  Two events are currently supported:
+
+      *project-change-merged* - when Zuul merges a change to a
+      project, it generates this event for every open change in that
+      project.
+
+      *parent-change-enqueued* - when Zuul enqueues a change into any
+      pipeline, it generates this event for every child of that
+      change.
+
+    *pipeline*
+    Only available for ``parent-change-enqueued`` events.  This is the
+    name of the pipeline in which the parent change was enqueued.
+
+    *require-approval*
+    This may be used for any event.  It requires that a certain kind
+    of approval be present for the current patchset of the change (the
+    approval could be added by the event in question).  It follows the
+    same syntax as the "approval" pipeline requirement below.
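+
+    For example, a check pipeline might use this trigger to re-examine
+    open child changes whenever their parent is enqueued into the gate
+    pipeline; the snippet below is illustrative::
+
+      trigger:
+        zuul:
+          - event: parent-change-enqueued
+            pipeline: gate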
+
+
 **require**
  If this section is present, it establishes pre-requisites for any
   kind of item entering the Pipeline.  Regardless of how the item is
@@ -893,7 +924,7 @@
 that all changes be processed by a pipeline but a project has no jobs
 that can be run on it.
 
-.. seealso:: The OpenStack Zuul configuration for a comprehensive example: https://github.com/openstack-infra/config/blob/master/modules/openstack_project/files/zuul/layout.yaml
+.. seealso:: The OpenStack Zuul configuration for a comprehensive example: https://git.openstack.org/cgit/openstack-infra/project-config/tree/zuul/layout.yaml
 
 Project Templates
 """""""""""""""""
@@ -947,6 +978,10 @@
      check:
       - foobar-extra-special-job
 
+Individual jobs may optionally be added to pipelines (e.g. check,
+gate, et cetera) for a project, in addition to those provided by
+templates.
+
 The order of the jobs listed in the project (which only affects the
 order of jobs listed on the report) will be the jobs from each
 template in the order listed, followed by any jobs individually listed
diff --git a/etc/status/public_html/jquery.zuul.js b/etc/status/public_html/jquery.zuul.js
index 01f8bd3..5d155af 100644
--- a/etc/status/public_html/jquery.zuul.js
+++ b/etc/status/public_html/jquery.zuul.js
@@ -50,7 +50,7 @@
         }, options);
 
         var collapsed_exceptions = [];
-        var current_filter = read_cookie('zuul_filter_string', current_filter);
+        var current_filter = read_cookie('zuul_filter_string', '');
         var $jq;
 
         var xhr,
@@ -111,6 +111,7 @@
                     );
                 }
 
+                $job_line.append($('<div style="clear: both"></div>'));
                 return $job_line;
             },
 
@@ -266,9 +267,21 @@
 
                 var $change_link = $('<small />');
                 if (change.url !== null) {
-                    $change_link.append(
-                        $('<a />').attr('href', change.url).text(change.id)
-                    );
+                    if (/^[0-9a-f]{40}$/.test(change.id)) {
+                        var change_id_short = change.id.slice(0, 7);
+                        $change_link.append(
+                            $('<a />').attr('href', change.url).append(
+                                $('<abbr />')
+                                    .attr('title', change.id)
+                                    .text(change_id_short)
+                            )
+                        );
+                    }
+                    else {
+                        $change_link.append(
+                            $('<a />').attr('href', change.url).text(change.id)
+                        );
+                    }
                 }
                 else {
                     $change_link.text(change_id);
diff --git a/requirements.txt b/requirements.txt
index eabcef3..dd947d6 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -5,7 +5,7 @@
 Paste
 WebOb>=1.2.3,<1.3
 paramiko>=1.8.0
-GitPython>=0.3.2.RC1
+GitPython==0.3.2.RC1
 lockfile>=0.8
 ordereddict
 python-daemon
@@ -14,8 +14,6 @@
 voluptuous>=0.7
 gear>=0.5.4,<1.0.0
 apscheduler>=2.1.1,<3.0
-python-swiftclient>=1.6
-python-keystoneclient>=0.4.2
 PrettyTable>=0.6,<0.8
 babel>=1.0
 six>=1.6.0
diff --git a/test-requirements.txt b/test-requirements.txt
index 99ada89..5192de7 100644
--- a/test-requirements.txt
+++ b/test-requirements.txt
@@ -6,7 +6,9 @@
 docutils==0.9.1
 discover
 fixtures>=0.3.14
+python-keystoneclient>=0.4.2
 python-subunit
+python-swiftclient>=1.6
 testrepository>=0.0.17
 testtools>=0.9.32
 sphinxcontrib-programoutput
diff --git a/tests/base.py b/tests/base.py
index a86de82..46c7087 100755
--- a/tests/base.py
+++ b/tests/base.py
@@ -52,9 +52,11 @@
 import zuul.reporter.smtp
 import zuul.trigger.gerrit
 import zuul.trigger.timer
+import zuul.trigger.zuultrigger
 
 FIXTURE_DIR = os.path.join(os.path.dirname(__file__),
                            'fixtures')
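+# When True, each test runs inside a dedicated TempDir fixture
+# (optionally rooted at $ZUUL_TEST_ROOT); set it to False to work
+# directly in $ZUUL_TEST_ROOT, e.g. to inspect test repos after a run.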
+USE_TEMPDIR = True
 
 logging.basicConfig(level=logging.DEBUG,
                     format='%(asctime)s %(name)-32s '
@@ -164,7 +166,7 @@
         if files:
             fn = files[0]
         else:
-            fn = '%s-%s' % (self.branch, self.number)
+            fn = '%s-%s' % (self.branch.replace('/', '_'), self.number)
         msg = self.subject + '-' + str(self.latest_patchset)
         c = self.add_fake_change_to_repo(msg, fn, large)
         ps_files = [{'file': '/COMMIT_MSG',
@@ -368,6 +370,7 @@
         self.fixture_dir = os.path.join(FIXTURE_DIR, 'gerrit')
         self.change_number = 0
         self.changes = {}
+        self.queries = []
 
     def addFakeChange(self, project, branch, subject, status='NEW'):
         self.change_number += 1
@@ -401,6 +404,14 @@
             return change.query()
         return {}
 
+    def simpleQuery(self, query):
+        # This is currently only used to return all open changes for a
+        # project
+        self.queries.append(query)
+        results = [change.query() for change in self.changes.values()]
+        results.append({"type": "stats", "rowCount": 1,
+                        "runTimeMilliseconds": 3})
+        return results
+
     def startWatching(self, *args, **kw):
         pass
 
@@ -811,8 +822,11 @@
                 level=logging.DEBUG,
                 format='%(asctime)s %(name)-32s '
                 '%(levelname)-8s %(message)s'))
-        tmp_root = self.useFixture(fixtures.TempDir(
-            rootdir=os.environ.get("ZUUL_TEST_ROOT"))).path
+        if USE_TEMPDIR:
+            tmp_root = self.useFixture(fixtures.TempDir(
+                    rootdir=os.environ.get("ZUUL_TEST_ROOT"))).path
+        else:
+            tmp_root = os.environ.get("ZUUL_TEST_ROOT")
         self.test_root = os.path.join(tmp_root, "zuul-test")
         self.upstream_root = os.path.join(self.test_root, "upstream")
         self.git_root = os.path.join(self.test_root, "git")
@@ -834,6 +848,9 @@
         self.init_repo("org/project1")
         self.init_repo("org/project2")
         self.init_repo("org/project3")
+        self.init_repo("org/project4")
+        self.init_repo("org/project5")
+        self.init_repo("org/project6")
         self.init_repo("org/one-job-project")
         self.init_repo("org/nonvoting-project")
         self.init_repo("org/templated-project")
@@ -906,6 +923,8 @@
         self.sched.registerTrigger(self.gerrit)
         self.timer = zuul.trigger.timer.Timer(self.config, self.sched)
         self.sched.registerTrigger(self.timer)
+        self.zuultrigger = zuul.trigger.zuultrigger.ZuulTrigger(self.config,
+                                                                self.sched)
+        self.sched.registerTrigger(self.zuultrigger)
 
         self.sched.registerReporter(
             zuul.reporter.gerrit.Reporter(self.gerrit))
@@ -934,9 +953,6 @@
         self.config.read(os.path.join(FIXTURE_DIR, "zuul.conf"))
 
     def assertFinalState(self):
-        # Make sure that the change cache is cleared
-        self.assertEqual(len(self.gerrit._change_cache.keys()), 0,
-                         "Change cache should have been cleared")
         # Make sure that git.Repo objects have been garbage collected.
         repos = []
         gc.collect()
@@ -967,7 +983,6 @@
         threads = threading.enumerate()
         if len(threads) > 1:
             self.log.error("More than one thread is running: %s" % threads)
-        super(ZuulTestCase, self).tearDown()
 
     def init_repo(self, project):
         parts = project.split('/')
@@ -990,15 +1005,26 @@
         master = repo.create_head('master')
         repo.create_tag('init')
 
-        mp = repo.create_head('mp')
-        repo.head.reference = mp
+        repo.head.reference = master
+        repo.head.reset(index=True, working_tree=True)
+        repo.git.clean('-x', '-f', '-d')
+
+        self.create_branch(project, 'mp')
+
+    def create_branch(self, project, branch):
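+        # Create a new branch at the upstream repo's current HEAD, add
+        # a commit touching README so the branch has distinct content,
+        # then leave the repo checked out on master again.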
+        path = os.path.join(self.upstream_root, project)
+        repo = git.Repo.init(path)
+        fn = os.path.join(path, 'README')
+
+        branch_head = repo.create_head(branch)
+        repo.head.reference = branch_head
         f = open(fn, 'a')
-        f.write("test mp\n")
+        f.write("test %s\n" % branch)
         f.close()
         repo.index.add([fn])
-        repo.index.commit('mp commit')
+        repo.index.commit('%s commit' % branch)
 
-        repo.head.reference = master
+        repo.head.reference = repo.heads['master']
         repo.head.reset(index=True, working_tree=True)
         repo.git.clean('-x', '-f', '-d')
 
diff --git a/tests/fixtures/layout-gating.yaml b/tests/fixtures/layout-cloner.yaml
similarity index 65%
rename from tests/fixtures/layout-gating.yaml
rename to tests/fixtures/layout-cloner.yaml
index a544a80..e840ed9 100644
--- a/tests/fixtures/layout-gating.yaml
+++ b/tests/fixtures/layout-cloner.yaml
@@ -22,8 +22,24 @@
 
   - name: org/project1
     gate:
-        - project1-project2-integration
+        - integration
 
   - name: org/project2
     gate:
-        - project1-project2-integration
+        - integration
+
+  - name: org/project3
+    gate:
+        - integration
+
+  - name: org/project4
+    gate:
+        - integration
+
+  - name: org/project5
+    gate:
+        - integration
+
+  - name: org/project6
+    gate:
+        - integration
diff --git a/tests/fixtures/layout-zuultrigger-enqueued.yaml b/tests/fixtures/layout-zuultrigger-enqueued.yaml
new file mode 100644
index 0000000..8babd9e
--- /dev/null
+++ b/tests/fixtures/layout-zuultrigger-enqueued.yaml
@@ -0,0 +1,53 @@
+pipelines:
+  - name: check
+    manager: IndependentPipelineManager
+    source: gerrit
+    require:
+      approval:
+        - verified: -1
+    trigger:
+      gerrit:
+        - event: patchset-created
+      zuul:
+        - event: parent-change-enqueued
+          pipeline: gate
+    success:
+      gerrit:
+        verified: 1
+    failure:
+      gerrit:
+        verified: -1
+
+  - name: gate
+    manager: DependentPipelineManager
+    failure-message: Build failed.  For information on how to proceed, see http://wiki.example.org/Test_Failures
+    source: gerrit
+    require:
+      approval:
+        - verified: 1
+    trigger:
+      gerrit:
+        - event: comment-added
+          approval:
+            - approved: 1
+      zuul:
+        - event: parent-change-enqueued
+          pipeline: gate
+    success:
+      gerrit:
+        verified: 2
+        submit: true
+    failure:
+      gerrit:
+        verified: -2
+    start:
+      gerrit:
+        verified: 0
+    precedence: high
+
+projects:
+  - name: org/project
+    check:
+      - project-check
+    gate:
+      - project-gate
diff --git a/tests/fixtures/layout-zuultrigger-merged.yaml b/tests/fixtures/layout-zuultrigger-merged.yaml
new file mode 100644
index 0000000..657700d
--- /dev/null
+++ b/tests/fixtures/layout-zuultrigger-merged.yaml
@@ -0,0 +1,53 @@
+pipelines:
+  - name: check
+    manager: IndependentPipelineManager
+    source: gerrit
+    trigger:
+      gerrit:
+        - event: patchset-created
+    success:
+      gerrit:
+        verified: 1
+    failure:
+      gerrit:
+        verified: -1
+
+  - name: gate
+    manager: DependentPipelineManager
+    failure-message: Build failed.  For information on how to proceed, see http://wiki.example.org/Test_Failures
+    source: gerrit
+    trigger:
+      gerrit:
+        - event: comment-added
+          approval:
+            - approved: 1
+    success:
+      gerrit:
+        verified: 2
+        submit: true
+    failure:
+      gerrit:
+        verified: -2
+    start:
+      gerrit:
+        verified: 0
+    precedence: high
+
+  - name: merge-check
+    manager: IndependentPipelineManager
+    source: gerrit
+    trigger:
+      zuul:
+        - event: project-change-merged
+    merge-failure:
+      gerrit:
+        verified: -1
+
+projects:
+  - name: org/project
+    check:
+      - project-check
+    gate:
+      - project-gate
+    merge-check:
+      - noop
diff --git a/tests/fixtures/layouts/bad_merge_failure.yaml b/tests/fixtures/layouts/bad_merge_failure.yaml
index 313d23b..fc6854e 100644
--- a/tests/fixtures/layouts/bad_merge_failure.yaml
+++ b/tests/fixtures/layouts/bad_merge_failure.yaml
@@ -10,6 +10,7 @@
     failure:
       gerrit:
         verified: -1
+    # merge-failure-message needs a string.
     merge-failure-message:
 
   - name: gate
diff --git a/tests/fixtures/layouts/bad_pipelines b/tests/fixtures/layouts/bad_pipelines
deleted file mode 100644
index f627208..0000000
--- a/tests/fixtures/layouts/bad_pipelines
+++ /dev/null
@@ -1 +0,0 @@
-pipelines:
diff --git a/tests/fixtures/layouts/bad_pipelines1.yaml b/tests/fixtures/layouts/bad_pipelines1.yaml
index 4207a2c..09638bc 100644
--- a/tests/fixtures/layouts/bad_pipelines1.yaml
+++ b/tests/fixtures/layouts/bad_pipelines1.yaml
@@ -1,4 +1,2 @@
+# Pipelines completely missing. At least one is required.
 pipelines:
-
-projects:
-  - name: foo
diff --git a/tests/fixtures/layouts/bad_pipelines10.yaml b/tests/fixtures/layouts/bad_pipelines10.yaml
index 5248c17..ddde946 100644
--- a/tests/fixtures/layouts/bad_pipelines10.yaml
+++ b/tests/fixtures/layouts/bad_pipelines10.yaml
@@ -4,4 +4,5 @@
 
 projects:
   - name: foo
-    merge-mode: foo
\ No newline at end of file
+    # merge-mode must be one of merge, merge-resolve, cherry-pick.
+    merge-mode: foo
diff --git a/tests/fixtures/layouts/bad_pipelines2.yaml b/tests/fixtures/layouts/bad_pipelines2.yaml
index e75a561..fc1e154 100644
--- a/tests/fixtures/layouts/bad_pipelines2.yaml
+++ b/tests/fixtures/layouts/bad_pipelines2.yaml
@@ -1,4 +1,5 @@
 pipelines:
+  # name is required for pipelines
   - noname: check
     manager: IndependentPipelineManager
 
diff --git a/tests/fixtures/layouts/bad_pipelines3.yaml b/tests/fixtures/layouts/bad_pipelines3.yaml
index 0c11a85..93ac266 100644
--- a/tests/fixtures/layouts/bad_pipelines3.yaml
+++ b/tests/fixtures/layouts/bad_pipelines3.yaml
@@ -1,5 +1,7 @@
 pipelines:
   - name: check
+    # The manager must be one of IndependentPipelineManager
+    # or DependentPipelineManager
     manager: NonexistentPipelineManager
 
 projects:
diff --git a/tests/fixtures/layouts/bad_pipelines4.yaml b/tests/fixtures/layouts/bad_pipelines4.yaml
index 7f58024..3a91604 100644
--- a/tests/fixtures/layouts/bad_pipelines4.yaml
+++ b/tests/fixtures/layouts/bad_pipelines4.yaml
@@ -3,6 +3,7 @@
     manager: IndependentPipelineManager
     trigger:
       gerrit:
+        # non-event is not a valid gerrit event
         - event: non-event
 
 projects:
diff --git a/tests/fixtures/layouts/bad_pipelines5.yaml b/tests/fixtures/layouts/bad_pipelines5.yaml
index 929c1a9..f95a78e 100644
--- a/tests/fixtures/layouts/bad_pipelines5.yaml
+++ b/tests/fixtures/layouts/bad_pipelines5.yaml
@@ -3,6 +3,7 @@
     manager: IndependentPipelineManager
     trigger:
       gerrit:
+        # event is a required item but it is missing.
         - approval:
             - approved: 1
 
diff --git a/tests/fixtures/layouts/bad_pipelines6.yaml b/tests/fixtures/layouts/bad_pipelines6.yaml
index 6dcdaf3..aa91c77 100644
--- a/tests/fixtures/layouts/bad_pipelines6.yaml
+++ b/tests/fixtures/layouts/bad_pipelines6.yaml
@@ -4,6 +4,7 @@
     trigger:
       gerrit:
         - event: comment-added
+          # approved is not a valid entry. Should be approval.
           approved: 1
 
 projects:
diff --git a/tests/fixtures/layouts/bad_pipelines7.yaml b/tests/fixtures/layouts/bad_pipelines7.yaml
index 7517b9a..e2db495 100644
--- a/tests/fixtures/layouts/bad_pipelines7.yaml
+++ b/tests/fixtures/layouts/bad_pipelines7.yaml
@@ -1,4 +1,5 @@
 pipelines:
+  # The pipeline must have a name.
   - manager: IndependentPipelineManager
 
 projects:
diff --git a/tests/fixtures/layouts/bad_pipelines8.yaml b/tests/fixtures/layouts/bad_pipelines8.yaml
index eeab038..9c5918e 100644
--- a/tests/fixtures/layouts/bad_pipelines8.yaml
+++ b/tests/fixtures/layouts/bad_pipelines8.yaml
@@ -1,4 +1,5 @@
 pipelines:
+  # The pipeline must have a manager
   - name: check
 
 projects:
diff --git a/tests/fixtures/layouts/bad_pipelines9.yaml b/tests/fixtures/layouts/bad_pipelines9.yaml
index ebb2e1f..89307d5 100644
--- a/tests/fixtures/layouts/bad_pipelines9.yaml
+++ b/tests/fixtures/layouts/bad_pipelines9.yaml
@@ -1,4 +1,5 @@
 pipelines:
+  # Names must be unique.
   - name: check
     manager: IndependentPipelineManager
   - name: check
diff --git a/tests/fixtures/layouts/bad_projects1.yaml b/tests/fixtures/layouts/bad_projects1.yaml
index c210c43..e3d381f 100644
--- a/tests/fixtures/layouts/bad_projects1.yaml
+++ b/tests/fixtures/layouts/bad_projects1.yaml
@@ -4,6 +4,7 @@
 
 projects:
   - name: foo
+  # gate pipeline is not defined.
     gate:
       - test
 
diff --git a/tests/fixtures/layouts/bad_projects2.yaml b/tests/fixtures/layouts/bad_projects2.yaml
index b91ed9d..9291cc9 100644
--- a/tests/fixtures/layouts/bad_projects2.yaml
+++ b/tests/fixtures/layouts/bad_projects2.yaml
@@ -5,5 +5,6 @@
 projects:
   - name: foo
     check:
+      # Indentation is one level too deep on the last line.
       - test
         - foo
diff --git a/tests/fixtures/layouts/bad_swift.yaml b/tests/fixtures/layouts/bad_swift.yaml
index d8a8c3f..e79dca6 100644
--- a/tests/fixtures/layouts/bad_swift.yaml
+++ b/tests/fixtures/layouts/bad_swift.yaml
@@ -16,11 +16,10 @@
     swift:
       - name: logs
   - name: ^.*-merge$
+    # swift requires a name
     swift:
         container: merge_assets
     failure-message: Unable to merge change
-  - name: test-test
-    swift:
 
 projects:
   - name: test-org/test
diff --git a/tests/fixtures/layouts/bad_template1.yaml b/tests/fixtures/layouts/bad_template1.yaml
index 15822d1..cab17a1 100644
--- a/tests/fixtures/layouts/bad_template1.yaml
+++ b/tests/fixtures/layouts/bad_template1.yaml
@@ -10,7 +10,7 @@
 project-templates:
   - name: template-generic
     check:
-     # Template uses the 'project' parameter' which must
+     # Template uses the 'project' parameter which must be provided
      - '{project}-merge'
 
 projects:
diff --git a/tests/fixtures/layouts/bad_template3.yaml b/tests/fixtures/layouts/bad_template3.yaml
index 70412b8..54697c4 100644
--- a/tests/fixtures/layouts/bad_template3.yaml
+++ b/tests/fixtures/layouts/bad_template3.yaml
@@ -1,8 +1,5 @@
 # Template refers to an unexisting pipeline
 
-pipelines:
-  # We have no pipelines at all
-
 project-templates:
   - name: template-generic
     unexisting-pipeline:  # pipeline does not exist
diff --git a/tests/test_cloner.py b/tests/test_cloner.py
index bb9d91f..ab2683d 100644
--- a/tests/test_cloner.py
+++ b/tests/test_cloner.py
@@ -41,21 +41,88 @@
         self.workspace_root = os.path.join(self.test_root, 'workspace')
 
         self.config.set('zuul', 'layout_config',
-                        'tests/fixtures/layout-gating.yaml')
+                        'tests/fixtures/layout-cloner.yaml')
         self.sched.reconfigure(self.config)
         self.registerJobs()
 
-    def test_cloner(self):
+    def getWorkspaceRepos(self, projects):
+        repos = {}
+        for project in projects:
+            repos[project] = git.Repo(
+                os.path.join(self.workspace_root, project))
+        return repos
+
+    def getUpstreamRepos(self, projects):
+        repos = {}
+        for project in projects:
+            repos[project] = git.Repo(
+                os.path.join(self.upstream_root, project))
+        return repos
+
+    def test_cache_dir(self):
+        projects = ['org/project1', 'org/project2']
+        cache_root = os.path.join(self.test_root, "cache")
+        for project in projects:
+            upstream_repo_path = os.path.join(self.upstream_root, project)
+            cache_repo_path = os.path.join(cache_root, project)
+            git.Repo.clone_from(upstream_repo_path, cache_repo_path)
+
+        self.worker.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        A.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
+
+        self.waitUntilSettled()
+
+        self.assertEquals(1, len(self.builds), "One build is running")
+
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        B.setMerged()
+
+        upstream = self.getUpstreamRepos(projects)
+        states = [
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('master')),
+             },
+            ]
+
+        for number, build in enumerate(self.builds):
+            self.log.debug("Build parameters: %s", build.parameters)
+            cloner = zuul.lib.cloner.Cloner(
+                git_base_url=self.upstream_root,
+                projects=projects,
+                workspace=self.workspace_root,
+                zuul_branch=build.parameters['ZUUL_BRANCH'],
+                zuul_ref=build.parameters['ZUUL_REF'],
+                zuul_url=self.git_root,
+                cache_dir=cache_root,
+                )
+            cloner.execute()
+            work = self.getWorkspaceRepos(projects)
+            state = states[number]
+
+            for project in projects:
+                self.assertEquals(state[project],
+                                  str(work[project].commit('HEAD')),
+                                  'Project %s commit for build %s should '
+                                  'be correct' % (project, number))
+
+        work = self.getWorkspaceRepos(projects)
+        upstream_repo_path = os.path.join(self.upstream_root, 'org/project1')
+        self.assertEquals(work['org/project1'].remotes.origin.url,
+                          upstream_repo_path,
+                          'workspace repo origin should be upstream, not cache')
+
+        self.worker.hold_jobs_in_build = False
+        self.worker.release()
+        self.waitUntilSettled()
+
+    def test_one_branch(self):
         self.worker.hold_jobs_in_build = True
 
+        projects = ['org/project1', 'org/project2']
         A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
         B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-
-        A.addPatchset(['project_one.txt'])
-        B.addPatchset(['project_two.txt'])
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-
         A.addApproval('CRVW', 2)
         B.addApproval('CRVW', 2)
         self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
@@ -65,39 +132,352 @@
 
         self.assertEquals(2, len(self.builds), "Two builds are running")
 
-        a_zuul_ref = b_zuul_ref = None
-        for build in self.builds:
+        upstream = self.getUpstreamRepos(projects)
+        states = [
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[1].parameters['ZUUL_COMMIT'],
+             },
+            ]
+
+        for number, build in enumerate(self.builds):
             self.log.debug("Build parameters: %s", build.parameters)
-            if build.parameters['ZUUL_CHANGE'] == '1':
-                a_zuul_ref = build.parameters['ZUUL_REF']
-                a_zuul_commit = build.parameters['ZUUL_COMMIT']
-            if build.parameters['ZUUL_CHANGE'] == '2':
-                b_zuul_ref = build.parameters['ZUUL_REF']
-                b_zuul_commit = build.parameters['ZUUL_COMMIT']
+            cloner = zuul.lib.cloner.Cloner(
+                git_base_url=self.upstream_root,
+                projects=projects,
+                workspace=self.workspace_root,
+                zuul_branch=build.parameters['ZUUL_BRANCH'],
+                zuul_ref=build.parameters['ZUUL_REF'],
+                zuul_url=self.git_root,
+                )
+            cloner.execute()
+            work = self.getWorkspaceRepos(projects)
+            state = states[number]
+
+            for project in projects:
+                self.assertEquals(state[project],
+                                  str(work[project].commit('HEAD')),
+                                  'Project %s commit for build %s should '
+                                  'be correct' % (project, number))
+
+            shutil.rmtree(self.workspace_root)
 
         self.worker.hold_jobs_in_build = False
         self.worker.release()
         self.waitUntilSettled()
 
-        # Repos setup, now test the cloner
-        for zuul_ref in [a_zuul_ref, b_zuul_ref]:
+    def test_multi_branch(self):
+        self.worker.hold_jobs_in_build = True
+        projects = ['org/project1', 'org/project2',
+                    'org/project3', 'org/project4']
+
+        self.create_branch('org/project2', 'stable/havana')
+        self.create_branch('org/project4', 'stable/havana')
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'stable/havana', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project3', 'master', 'C')
+        A.addApproval('CRVW', 2)
+        B.addApproval('CRVW', 2)
+        C.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(B.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(C.addApproval('APRV', 1))
+
+        self.waitUntilSettled()
+
+        self.assertEquals(3, len(self.builds), "Three builds are running")
+
+        upstream = self.getUpstreamRepos(projects)
+        states = [
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('master')),
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].
+                                 commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].
+                                 commit('stable/havana')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('master')),
+             'org/project3': self.builds[2].parameters['ZUUL_COMMIT'],
+             'org/project4': str(upstream['org/project4'].
+                                 commit('master')),
+             },
+            ]
+
+        for number, build in enumerate(self.builds):
+            self.log.debug("Build parameters: %s", build.parameters)
             cloner = zuul.lib.cloner.Cloner(
                 git_base_url=self.upstream_root,
-                projects=['org/project1', 'org/project2'],
+                projects=projects,
                 workspace=self.workspace_root,
-                zuul_branch='master',
-                zuul_ref=zuul_ref,
+                zuul_branch=build.parameters['ZUUL_BRANCH'],
+                zuul_ref=build.parameters['ZUUL_REF'],
                 zuul_url=self.git_root,
-                branch='master',
-                clone_map_file=os.path.join(FIXTURE_DIR, 'clonemap.yaml')
-            )
+                )
             cloner.execute()
-            work_repo1 = git.Repo(os.path.join(self.workspace_root,
-                                               'org/project1'))
-            self.assertEquals(a_zuul_commit, str(work_repo1.commit('HEAD')))
+            work = self.getWorkspaceRepos(projects)
+            state = states[number]
 
-            work_repo2 = git.Repo(os.path.join(self.workspace_root,
-                                               'org/project2'))
-            self.assertEquals(b_zuul_commit, str(work_repo2.commit('HEAD')))
-
+            for project in projects:
+                self.assertEquals(state[project],
+                                  str(work[project].commit('HEAD')),
+                                  'Project %s commit for build %s should '
+                                  'be correct' % (project, number))
             shutil.rmtree(self.workspace_root)
+
+        self.worker.hold_jobs_in_build = False
+        self.worker.release()
+        self.waitUntilSettled()
+
+    def test_upgrade(self):
+        # Simulates an upgrade test
+        self.worker.hold_jobs_in_build = True
+        projects = ['org/project1', 'org/project2', 'org/project3',
+                    'org/project4', 'org/project5', 'org/project6']
+
+        self.create_branch('org/project2', 'stable/havana')
+        self.create_branch('org/project3', 'stable/havana')
+        self.create_branch('org/project4', 'stable/havana')
+        self.create_branch('org/project5', 'stable/havana')
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project3', 'stable/havana', 'C')
+        D = self.fake_gerrit.addFakeChange('org/project3', 'master', 'D')
+        E = self.fake_gerrit.addFakeChange('org/project4', 'stable/havana', 'E')
+        A.addApproval('CRVW', 2)
+        B.addApproval('CRVW', 2)
+        C.addApproval('CRVW', 2)
+        D.addApproval('CRVW', 2)
+        E.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(B.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(C.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(D.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(E.addApproval('APRV', 1))
+
+        self.waitUntilSettled()
+
+        self.assertEquals(5, len(self.builds), "Five builds are running")
+
+        # Check the old side of the upgrade first
+        upstream = self.getUpstreamRepos(projects)
+        states = [
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('stable/havana')),
+             'org/project3': str(upstream['org/project3'].commit('stable/havana')),
+             'org/project4': str(upstream['org/project4'].commit('stable/havana')),
+             'org/project5': str(upstream['org/project5'].commit('stable/havana')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('stable/havana')),
+             'org/project3': str(upstream['org/project3'].commit('stable/havana')),
+             'org/project4': str(upstream['org/project4'].commit('stable/havana')),
+             'org/project5': str(upstream['org/project5'].commit('stable/havana')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('stable/havana')),
+             'org/project3': self.builds[2].parameters['ZUUL_COMMIT'],
+             'org/project4': str(upstream['org/project4'].commit('stable/havana')),
+             'org/project5': str(upstream['org/project5'].commit('stable/havana')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('stable/havana')),
+             'org/project3': self.builds[2].parameters['ZUUL_COMMIT'],
+             'org/project4': str(upstream['org/project4'].commit('stable/havana')),
+             'org/project5': str(upstream['org/project5'].commit('stable/havana')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('stable/havana')),
+             'org/project3': self.builds[2].parameters['ZUUL_COMMIT'],
+             'org/project4': self.builds[4].parameters['ZUUL_COMMIT'],
+             'org/project5': str(upstream['org/project5'].commit('stable/havana')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            ]
+
+        for number, build in enumerate(self.builds):
+            self.log.debug("Build parameters: %s", build.parameters)
+            change_number = int(build.parameters['ZUUL_CHANGE'])
+            cloner = zuul.lib.cloner.Cloner(
+                git_base_url=self.upstream_root,
+                projects=projects,
+                workspace=self.workspace_root,
+                zuul_branch=build.parameters['ZUUL_BRANCH'],
+                zuul_ref=build.parameters['ZUUL_REF'],
+                zuul_url=self.git_root,
+                branch='stable/havana',  # Old branch for upgrade
+                )
+            cloner.execute()
+            work = self.getWorkspaceRepos(projects)
+            state = states[number]
+
+            for project in projects:
+                self.assertEquals(state[project],
+                                  str(work[project].commit('HEAD')),
+                                  'Project %s commit for build %s should '
+                                  'be correct on old side of upgrade' %
+                                  (project, number))
+            shutil.rmtree(self.workspace_root)
+
+        # Check the new side of the upgrade
+        states = [
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('master')),
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project3': self.builds[3].parameters['ZUUL_COMMIT'],
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project3': self.builds[3].parameters['ZUUL_COMMIT'],
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            ]
+
+        for number, build in enumerate(self.builds):
+            self.log.debug("Build parameters: %s", build.parameters)
+            change_number = int(build.parameters['ZUUL_CHANGE'])
+            cloner = zuul.lib.cloner.Cloner(
+                git_base_url=self.upstream_root,
+                projects=projects,
+                workspace=self.workspace_root,
+                zuul_branch=build.parameters['ZUUL_BRANCH'],
+                zuul_ref=build.parameters['ZUUL_REF'],
+                zuul_url=self.git_root,
+                branch='master',  # New branch for upgrade
+                )
+            cloner.execute()
+            work = self.getWorkspaceRepos(projects)
+            state = states[number]
+
+            for project in projects:
+                self.assertEquals(state[project],
+                                  str(work[project].commit('HEAD')),
+                                  'Project %s commit for build %s should '
+                                  'be correct on new side of upgrade' %
+                                  (project, number))
+            shutil.rmtree(self.workspace_root)
+
+        self.worker.hold_jobs_in_build = False
+        self.worker.release()
+        self.waitUntilSettled()
+
+    def test_project_override(self):
+        self.worker.hold_jobs_in_build = True
+        projects = ['org/project1', 'org/project2', 'org/project3',
+                    'org/project4', 'org/project5', 'org/project6']
+
+        self.create_branch('org/project3', 'stable/havana')
+        self.create_branch('org/project4', 'stable/havana')
+        self.create_branch('org/project6', 'stable/havana')
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
+        D = self.fake_gerrit.addFakeChange('org/project3', 'stable/havana', 'D')
+        A.addApproval('CRVW', 2)
+        B.addApproval('CRVW', 2)
+        C.addApproval('CRVW', 2)
+        D.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(B.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(C.addApproval('APRV', 1))
+        self.fake_gerrit.addEvent(D.addApproval('APRV', 1))
+
+        self.waitUntilSettled()
+
+        self.assertEquals(4, len(self.builds), "Four builds are running")
+
+        upstream = self.getUpstreamRepos(projects)
+        states = [
+            {'org/project1': self.builds[0].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('master')),
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project2': str(upstream['org/project2'].commit('master')),
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[2].parameters['ZUUL_COMMIT'],
+             'org/project3': str(upstream['org/project3'].commit('master')),
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('master')),
+             },
+            {'org/project1': self.builds[1].parameters['ZUUL_COMMIT'],
+             'org/project2': self.builds[2].parameters['ZUUL_COMMIT'],
+             'org/project3': self.builds[3].parameters['ZUUL_COMMIT'],
+             'org/project4': str(upstream['org/project4'].commit('master')),
+             'org/project5': str(upstream['org/project5'].commit('master')),
+             'org/project6': str(upstream['org/project6'].commit('stable/havana')),
+             },
+            ]
+
+        for number, build in enumerate(self.builds):
+            self.log.debug("Build parameters: %s", build.parameters)
+            change_number = int(build.parameters['ZUUL_CHANGE'])
+            cloner = zuul.lib.cloner.Cloner(
+                git_base_url=self.upstream_root,
+                projects=projects,
+                workspace=self.workspace_root,
+                zuul_branch=build.parameters['ZUUL_BRANCH'],
+                zuul_ref=build.parameters['ZUUL_REF'],
+                zuul_url=self.git_root,
+                project_branches={'org/project4': 'master'},
+                )
+            cloner.execute()
+            work = self.getWorkspaceRepos(projects)
+            state = states[number]
+
+            for project in projects:
+                self.assertEquals(state[project],
+                                  str(work[project].commit('HEAD')),
+                                  'Project %s commit for build %s should '
+                                  'be correct' % (project, number))
+            shutil.rmtree(self.workspace_root)
+
+        self.worker.hold_jobs_in_build = False
+        self.worker.release()
+        self.waitUntilSettled()
diff --git a/tests/test_layoutvalidator.py b/tests/test_layoutvalidator.py
index 7e9f1d5..5a8fc46 100644
--- a/tests/test_layoutvalidator.py
+++ b/tests/test_layoutvalidator.py
@@ -57,7 +57,7 @@
                     error = str(e)
                     print '  ', error
                     if error in errors:
-                        raise Exception("Error has already beed tested: %s" %
+                        raise Exception("Error has already been tested: %s" %
                                         error)
                     else:
                         errors.append(error)
diff --git a/tests/test_scheduler.py b/tests/test_scheduler.py
index c6c608f..a7548c1 100755
--- a/tests/test_scheduler.py
+++ b/tests/test_scheduler.py
@@ -651,19 +651,19 @@
         "Test whether a change is ready to merge"
         # TODO: move to test_gerrit (this is a unit test!)
         A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
-        trigger = self.sched.layout.pipelines['gate'].trigger
-        a = self.sched.triggers['gerrit'].getChange(1, 2)
+        source = self.sched.layout.pipelines['gate'].source
+        a = source._getChange(1, 2)
         mgr = self.sched.layout.pipelines['gate'].manager
-        self.assertFalse(trigger.canMerge(a, mgr.getSubmitAllowNeeds()))
+        self.assertFalse(source.canMerge(a, mgr.getSubmitAllowNeeds()))
 
         A.addApproval('CRVW', 2)
-        a = trigger.getChange(1, 2, refresh=True)
-        self.assertFalse(trigger.canMerge(a, mgr.getSubmitAllowNeeds()))
+        a = source._getChange(1, 2, refresh=True)
+        self.assertFalse(source.canMerge(a, mgr.getSubmitAllowNeeds()))
 
         A.addApproval('APRV', 1)
-        a = trigger.getChange(1, 2, refresh=True)
-        self.assertTrue(trigger.canMerge(a, mgr.getSubmitAllowNeeds()))
-        trigger.maintainCache([])
+        a = source._getChange(1, 2, refresh=True)
+        self.assertTrue(source.canMerge(a, mgr.getSubmitAllowNeeds()))
+        source.maintainCache([])
 
     def test_build_configuration(self):
         "Test that zuul merges the right commits for testing"
@@ -1742,6 +1742,7 @@
         sched = zuul.scheduler.Scheduler()
         sched.registerTrigger(None, 'gerrit')
         sched.registerTrigger(None, 'timer')
+        sched.registerTrigger(None, 'zuul')
         sched.testConfig(self.config.get('zuul', 'layout_config'))
 
     def test_build_description(self):
@@ -2130,6 +2131,7 @@
                         'tests/fixtures/layout-no-timer.yaml')
         self.sched.reconfigure(self.config)
         self.registerJobs()
+        self.waitUntilSettled()
         self.worker.release('.*')
         self.waitUntilSettled()
 
diff --git a/tests/test_webapp.py b/tests/test_webapp.py
new file mode 100644
index 0000000..b127c51
--- /dev/null
+++ b/tests/test_webapp.py
@@ -0,0 +1,85 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Hewlett-Packard Development Company, L.P.
+# Copyright 2014 Rackspace Australia
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import json
+import urllib2
+
+from tests.base import ZuulTestCase
+
+
+class TestWebapp(ZuulTestCase):
+
+    def _cleanup(self):
+        self.worker.hold_jobs_in_build = False
+        self.worker.release()
+        self.waitUntilSettled()
+
+    def setUp(self):
+        super(TestWebapp, self).setUp()
+        self.addCleanup(self._cleanup)
+        self.worker.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        A.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+        B.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(B.addApproval('APRV', 1))
+        self.waitUntilSettled()
+        self.port = self.webapp.server.socket.getsockname()[1]
+
+    def test_webapp_status(self):
+        "Test that we can filter to only certain changes in the webapp."
+
+        req = urllib2.Request(
+            "http://localhost:%s/status" % self.port)
+        f = urllib2.urlopen(req)
+        data = json.loads(f.read())
+
+        self.assertIn('pipelines', data)
+
+    def test_webapp_status_compat(self):
+        # testing compat with status.json
+        req = urllib2.Request(
+            "http://localhost:%s/status.json" % self.port)
+        f = urllib2.urlopen(req)
+        data = json.loads(f.read())
+
+        self.assertIn('pipelines', data)
+
+    def test_webapp_bad_url(self):
+        # do we 404 correctly
+        req = urllib2.Request(
+            "http://localhost:%s/status/foo" % self.port)
+        self.assertRaises(urllib2.HTTPError, urllib2.urlopen, req)
+
+    def test_webapp_find_change(self):
+        # can we filter by change id
+        req = urllib2.Request(
+            "http://localhost:%s/status/change/1,1" % self.port)
+        f = urllib2.urlopen(req)
+        data = json.loads(f.read())
+
+        self.assertEqual(1, len(data), data)
+        self.assertEqual("org/project", data[0]['project'])
+
+        req = urllib2.Request(
+            "http://localhost:%s/status/change/2,1" % self.port)
+        f = urllib2.urlopen(req)
+        data = json.loads(f.read())
+
+        self.assertEqual(1, len(data), data)
+        self.assertEqual("org/project1", data[0]['project'], data)
diff --git a/tests/test_zuultrigger.py b/tests/test_zuultrigger.py
new file mode 100644
index 0000000..9a90a98
--- /dev/null
+++ b/tests/test_zuultrigger.py
@@ -0,0 +1,136 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Hewlett-Packard Development Company, L.P.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import logging
+import time
+
+from tests.base import ZuulTestCase
+
+logging.basicConfig(level=logging.DEBUG,
+                    format='%(asctime)s %(name)-32s '
+                    '%(levelname)-8s %(message)s')
+
+
+class TestZuulTrigger(ZuulTestCase):
+    """Test Zuul Trigger"""
+
+    def test_zuul_trigger_parent_change_enqueued(self):
+        "Test Zuul trigger event: parent-change-enqueued"
+        self.config.set('zuul', 'layout_config',
+                        'tests/fixtures/layout-zuultrigger-enqueued.yaml')
+        self.sched.reconfigure(self.config)
+        self.registerJobs()
+
+        # This test has the following three changes:
+        # B1 -> A; B2 -> A
+        # When A is enqueued in the gate, B1 and B2 should both attempt
+        # to be enqueued in both pipelines.  B1 should end up in check
+        # and B2 in gate because of differing pipeline requirements.
+        self.worker.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        B1 = self.fake_gerrit.addFakeChange('org/project', 'master', 'B1')
+        B2 = self.fake_gerrit.addFakeChange('org/project', 'master', 'B2')
+        A.addApproval('CRVW', 2)
+        B1.addApproval('CRVW', 2)
+        B2.addApproval('CRVW', 2)
+        A.addApproval('VRFY', 1)  # required by gate
+        B1.addApproval('VRFY', -1)  # should go to check
+        B2.addApproval('VRFY', 1)  # should go to gate
+        B1.addApproval('APRV', 1)
+        B2.addApproval('APRV', 1)
+        B1.setDependsOn(A, 1)
+        B2.setDependsOn(A, 1)
+        self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
+        # Jobs are being held in build to make sure that 3,1 has time
+        # to enqueue behind 1,1 so that the test is more
+        # deterministic.
+        self.waitUntilSettled()
+        self.worker.hold_jobs_in_build = False
+        self.worker.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(len(self.history), 3)
+        for job in self.history:
+            if job.changes == '1,1':
+                self.assertEqual(job.name, 'project-gate')
+            elif job.changes == '2,1':
+                self.assertEqual(job.name, 'project-check')
+            elif job.changes == '1,1 3,1':
+                self.assertEqual(job.name, 'project-gate')
+            else:
+                raise Exception("Unknown job")
+
+    def test_zuul_trigger_project_change_merged(self):
+        "Test Zuul trigger event: project-change-merged"
+        self.config.set('zuul', 'layout_config',
+                        'tests/fixtures/layout-zuultrigger-merged.yaml')
+        self.sched.reconfigure(self.config)
+        self.registerJobs()
+
+        # This test has the following three changes:
+        # A, B, C;  B conflicts with A, but C does not.
+        # When A is merged, B and C should be checked for conflicts,
+        # and B should receive a -1.
+        # D and E are used to repeat the test in the second part, but
+        # are defined here so that they end up in the trigger cache.
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project', 'master', 'C')
+        D = self.fake_gerrit.addFakeChange('org/project', 'master', 'D')
+        E = self.fake_gerrit.addFakeChange('org/project', 'master', 'E')
+        A.addPatchset(['conflict'])
+        B.addPatchset(['conflict'])
+        D.addPatchset(['conflict2'])
+        E.addPatchset(['conflict2'])
+        A.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(len(self.history), 1)
+        self.assertEqual(self.history[0].name, 'project-gate')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 1)
+        self.assertEqual(C.reported, 0)
+        self.assertEqual(D.reported, 0)
+        self.assertEqual(E.reported, 0)
+        self.assertEqual(B.messages[0],
+            "Merge Failed.\n\nThis change was unable to be automatically "
+            "merged with the current state of the repository. Please rebase "
+            "your change and upload a new patchset.")
+        self.assertEqual(self.fake_gerrit.queries[0], "project:org/project status:open")
+
+        # Reconfigure and run the test again.  This is a regression
+        # check to make sure that we don't end up with a stale trigger
+        # cache that has references to projects from the old
+        # configuration.
+        self.sched.reconfigure(self.config)
+
+        D.addApproval('CRVW', 2)
+        self.fake_gerrit.addEvent(D.addApproval('APRV', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(len(self.history), 2)
+        self.assertEqual(self.history[1].name, 'project-gate')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 1)
+        self.assertEqual(C.reported, 0)
+        self.assertEqual(D.reported, 2)
+        self.assertEqual(E.reported, 1)
+        self.assertEqual(E.messages[0],
+            "Merge Failed.\n\nThis change was unable to be automatically "
+            "merged with the current state of the repository. Please rebase "
+            "your change and upload a new patchset.")
+        self.assertEqual(self.fake_gerrit.queries[1], "project:org/project status:open")
diff --git a/tox.ini b/tox.ini
index 3510ddb..6e45969 100644
--- a/tox.ini
+++ b/tox.ini
@@ -26,7 +26,7 @@
 commands =
   python setup.py testr --coverage
 
-[testenv:doc]
+[testenv:docs]
 commands = python setup.py build_sphinx
 
 [testenv:venv]
diff --git a/zuul/cmd/cloner.py b/zuul/cmd/cloner.py
index 1310c16..a895f24 100755
--- a/zuul/cmd/cloner.py
+++ b/zuul/cmd/cloner.py
@@ -54,6 +54,10 @@
         parser.add_argument('--version', dest='version', action='version',
                             version=self._get_version(),
                             help='show zuul version')
+        parser.add_argument('--cache-dir', dest='cache_dir',
+                            help=('a directory that holds cached copies of '
+                                  'repos from which to make an initial clone.'
+                                  ))
         parser.add_argument('git_base_url',
                             help='reference repo to clone from')
         parser.add_argument('projects', nargs='+',
@@ -61,17 +65,24 @@
 
         project_env = parser.add_argument_group(
             'project tuning'
-        )
+            )
         project_env.add_argument(
             '--branch',
             help=('branch to checkout instead of Zuul selected branch, '
                   'for example to specify an alternate branch to test '
                   'client library compatibility.')
-        )
+            )
+        project_env.add_argument(
+            '--project-branch', nargs=1, action='append',
+            metavar='PROJECT=BRANCH',
+            help=('project-specific branch to check out, which takes '
+                  'precedence over --branch if it is provided; may be '
+                  'specified multiple times.')
+            )
 
         zuul_env = parser.add_argument_group(
             'zuul environnement',
-            'Let you override $ZUUL_* environnement variables.'
+            'Let you override $ZUUL_* environment variables.'
         )
         for zuul_suffix in ZUUL_ENV_SUFFIXES:
             env_name = 'ZUUL_%s' % zuul_suffix.upper()
@@ -120,6 +131,11 @@
     def main(self):
         self.parse_arguments()
         self.setup_logging(color=self.args.color, verbose=self.args.verbose)
+        project_branches = {}
+        if self.args.project_branch:
+            for x in self.args.project_branch:
+                project, branch = x[0].split('=')
+                project_branches[project] = branch
         cloner = zuul.lib.cloner.Cloner(
             git_base_url=self.args.git_base_url,
             projects=self.args.projects,
@@ -128,7 +144,9 @@
             zuul_ref=self.args.zuul_ref,
             zuul_url=self.args.zuul_url,
             branch=self.args.branch,
-            clone_map_file=self.args.clone_map_file
+            clone_map_file=self.args.clone_map_file,
+            project_branches=project_branches,
+            cache_dir=self.args.cache_dir,
         )
         cloner.execute()
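The repeated ``--project-branch PROJECT=BRANCH`` arguments above collapse into
the ``project_branches`` dict handed to the Cloner. A minimal, self-contained
sketch of that parsing, using invented project and branch names::

  # Each value arrives as a one-element list because of
  # nargs=1 with action='append'.
  raw = [['openstack/nova=stable/icehouse'], ['openstack/neutron=master']]
  project_branches = {}
  for x in raw:
      project, branch = x[0].split('=')
      project_branches[project] = branch
  assert project_branches == {'openstack/nova': 'stable/icehouse',
                              'openstack/neutron': 'master'}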
 
diff --git a/zuul/cmd/server.py b/zuul/cmd/server.py
index d7de85a..25dab6f 100755
--- a/zuul/cmd/server.py
+++ b/zuul/cmd/server.py
@@ -87,6 +87,7 @@
         self.sched.registerReporter(None, 'smtp')
         self.sched.registerTrigger(None, 'gerrit')
         self.sched.registerTrigger(None, 'timer')
+        self.sched.registerTrigger(None, 'zuul')
         layout = self.sched.testConfig(self.config.get('zuul',
                                                        'layout_config'))
         if not job_list_path:
@@ -145,6 +146,7 @@
         import zuul.reporter.smtp
         import zuul.trigger.gerrit
         import zuul.trigger.timer
+        import zuul.trigger.zuultrigger
         import zuul.webapp
         import zuul.rpclistener
 
@@ -163,6 +165,7 @@
         merger = zuul.merger.client.MergeClient(self.config, self.sched)
         gerrit = zuul.trigger.gerrit.Gerrit(self.config, self.sched)
         timer = zuul.trigger.timer.Timer(self.config, self.sched)
+        zuultrigger = zuul.trigger.zuultrigger.ZuulTrigger(self.config, self.sched)
         if self.config.has_option('zuul', 'status_expiry'):
             cache_expiry = self.config.getint('zuul', 'status_expiry')
         else:
@@ -185,6 +188,7 @@
         self.sched.setMerger(merger)
         self.sched.registerTrigger(gerrit)
         self.sched.registerTrigger(timer)
+        self.sched.registerTrigger(zuultrigger)
         self.sched.registerReporter(gerrit_reporter)
         self.sched.registerReporter(smtp_reporter)
 
diff --git a/zuul/layoutvalidator.py b/zuul/layoutvalidator.py
index b445908..6969653 100644
--- a/zuul/layoutvalidator.py
+++ b/zuul/layoutvalidator.py
@@ -66,8 +66,16 @@
 
     timer_trigger = {v.Required('time'): str}
 
-    trigger = v.Required(v.Any({'gerrit': toList(gerrit_trigger)},
-                               {'timer': toList(timer_trigger)}))
+    zuul_trigger = {v.Required('event'):
+                    toList(v.Any('parent-change-enqueued',
+                                 'project-change-merged')),
+                    'pipeline': toList(str),
+                    'require-approval': toList(require_approval),
+                    }
+
+    trigger = v.Required({'gerrit': toList(gerrit_trigger),
+                          'timer': toList(timer_trigger),
+                          'zuul': toList(zuul_trigger)})
 
     report_actions = {'gerrit': variable_dict,
                       'smtp': {'to': str,
@@ -88,6 +96,7 @@
 
     pipeline = {v.Required('name'): str,
                 v.Required('manager'): manager,
+                'source': v.Any('gerrit'),
                 'precedence': precedence,
                 'description': str,
                 'require': require,
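For a rough feel of what the new ``zuul`` trigger schema accepts, here is a
simplified voluptuous sketch; it leaves out ``toList`` and ``require-approval``
handling, and the trigger stanza being validated is invented::

  import voluptuous as v

  # Simplified stand-in for the zuul_trigger schema above.
  zuul_trigger = v.Schema({
      v.Required('event'): v.Any('parent-change-enqueued',
                                 'project-change-merged'),
      'pipeline': str,
  })
  zuul_trigger({'event': 'parent-change-enqueued', 'pipeline': 'gate'})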
diff --git a/zuul/lib/cloner.py b/zuul/lib/cloner.py
index 0961eb4..89ebada 100644
--- a/zuul/lib/cloner.py
+++ b/zuul/lib/cloner.py
@@ -28,18 +28,21 @@
     log = logging.getLogger("zuul.Cloner")
 
     def __init__(self, git_base_url, projects, workspace, zuul_branch,
-                 zuul_ref, zuul_url, branch=None, clone_map_file=None):
+                 zuul_ref, zuul_url, branch=None, clone_map_file=None,
+                 project_branches=None, cache_dir=None):
 
         self.clone_map = []
         self.dests = None
 
         self.branch = branch
         self.git_url = git_base_url
+        self.cache_dir = cache_dir
         self.projects = projects
         self.workspace = workspace
         self.zuul_branch = zuul_branch
         self.zuul_ref = zuul_ref
         self.zuul_url = zuul_url
+        self.project_branches = project_branches or {}
 
         if clone_map_file:
             self.readCloneMap(clone_map_file)
@@ -64,9 +67,24 @@
         self.log.info("Prepared all repositories")
 
     def cloneUpstream(self, project, dest):
+        # Check for a cached git repo first
+        git_cache = '%s/%s' % (self.cache_dir, project)
         git_upstream = '%s/%s' % (self.git_url, project)
-        self.log.info("Creating repo %s from upstream %s",
-                      project, git_upstream)
+        if (self.cache_dir and
+            os.path.exists(git_cache) and
+            not os.path.exists(dest)):
+            # file:// tells git not to hard-link across repos
+            git_cache = 'file://%s' % git_cache
+            self.log.info("Creating repo %s from cache %s",
+                          project, git_cache)
+            new_repo = git.Repo.clone_from(git_cache, dest)
+            self.log.info("Updating origin remote in repo %s to %s",
+                          project, git_upstream)
+            new_repo.remotes.origin.config_writer.set(
+                'url', git_upstream)
+        else:
+            self.log.info("Creating repo %s from upstream %s",
+                          project, git_upstream)
         repo = Repo(
             remote=git_upstream,
             local=dest,
@@ -98,6 +116,12 @@
          2) Zuul reference for the master branch
          3) The tip of the indicated branch
          4) The tip of the master branch
+
+        The "indicated branch" is one of the following:
+
+         A) The project-specific override branch (from project_branches arg)
+         B) The user specified branch (from the branch arg)
+         C) ZUUL_BRANCH (from the zuul_branch arg)
         """
 
         repo = self.cloneUpstream(project, dest)
@@ -106,22 +130,24 @@
         # Ensure that we don't have stale remotes around
         repo.prune()
 
-        override_zuul_ref = self.zuul_ref
-        # FIXME should be origin HEAD branch which might not be 'master'
-        fallback_branch = 'master'
-        fallback_zuul_ref = re.sub(self.zuul_branch, fallback_branch,
+        indicated_branch = self.branch or self.zuul_branch
+        if project in self.project_branches:
+            indicated_branch = self.project_branches[project]
+
+        override_zuul_ref = re.sub(self.zuul_branch, indicated_branch,
                                    self.zuul_ref)
 
-        if self.branch:
-            override_zuul_ref = re.sub(self.zuul_branch, self.branch,
-                                       self.zuul_ref)
-            if repo.hasBranch(self.branch):
-                self.log.debug("upstream repo has branch %s", self.branch)
-                fallback_branch = self.branch
-                fallback_zuul_ref = self.zuul_ref
-            else:
-                self.log.exception("upstream repo is missing branch %s",
-                                   self.branch)
+        if repo.hasBranch(indicated_branch):
+            self.log.debug("upstream repo has branch %s", indicated_branch)
+            fallback_branch = indicated_branch
+        else:
+            self.log.debug("upstream repo is missing branch %s",
+                           indicated_branch)
+            # FIXME should be origin HEAD branch which might not be 'master'
+            fallback_branch = 'master'
+
+        fallback_zuul_ref = re.sub(self.zuul_branch, fallback_branch,
+                                   self.zuul_ref)
 
         if (self.fetchFromZuul(repo, project, override_zuul_ref)
             or (fallback_zuul_ref != override_zuul_ref and
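The reworked fallback logic above is easier to follow with concrete values.
A small sketch of how the indicated branch and the override/fallback Zuul refs
are derived for one project; every name and ref below is invented::

  import re

  zuul_branch = 'master'
  zuul_ref = 'refs/zuul/master/Zabc123'            # invented ref
  branch = None                                    # no --branch given
  project_branches = {'org/plugin': 'stable/foo'}  # from --project-branch

  indicated_branch = branch or zuul_branch
  if 'org/plugin' in project_branches:
      indicated_branch = project_branches['org/plugin']

  override_zuul_ref = re.sub(zuul_branch, indicated_branch, zuul_ref)
  assert override_zuul_ref == 'refs/zuul/stable/foo/Zabc123'

  # If the upstream repo lacks 'stable/foo', the code falls back to
  # 'master', so the fallback ref is the original zuul_ref again.
  fallback_zuul_ref = re.sub(zuul_branch, 'master', zuul_ref)
  assert fallback_zuul_ref == zuul_ref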
diff --git a/zuul/lib/gerrit.py b/zuul/lib/gerrit.py
index 30fb6fe..52e6057 100644
--- a/zuul/lib/gerrit.py
+++ b/zuul/lib/gerrit.py
@@ -144,6 +144,23 @@
                        (pprint.pformat(data)))
         return data
 
+    def simpleQuery(self, query):
+        args = '--current-patch-set'
+        cmd = 'gerrit query --format json %s %s' % (
+            args, query)
+        out, err = self._ssh(cmd)
+        if not out:
+            return False
+        lines = out.split('\n')
+        if not lines:
+            return False
+        data = [json.loads(line) for line in lines[:-1]]
+        if not data:
+            return False
+        self.log.debug("Received data from Gerrit query: \n%s" %
+                       (pprint.pformat(data)))
+        return data
+
     def _open(self):
         client = paramiko.SSHClient()
         client.load_system_host_keys()
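The parsing in ``simpleQuery`` is easy to misread: Gerrit terminates its
line-delimited JSON output with a statistics record, and the trailing newline
leaves an empty string after the split. A sketch with invented query output::

  import json

  out = ('{"number": "1234", "project": "org/project"}\n'
         '{"type": "stats", "rowCount": 1}\n')
  lines = out.split('\n')        # the last element is an empty string
  data = [json.loads(line) for line in lines[:-1]]
  # data still ends with the stats record, which is why a caller such
  # as getProjectOpenChanges() iterates over data[:-1].
  records = data[:-1]
  assert records[0]['number'] == '1234'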
diff --git a/zuul/merger/client.py b/zuul/merger/client.py
index 72fd4c5..8c41563 100644
--- a/zuul/merger/client.py
+++ b/zuul/merger/client.py
@@ -18,6 +18,8 @@
 
 import gear
 
+import zuul.model
+
 
 def getJobData(job):
     if not len(job.data):
@@ -79,23 +81,26 @@
             return True
         return False
 
-    def submitJob(self, name, data, build_set):
+    def submitJob(self, name, data, build_set,
+                  precedence=zuul.model.PRECEDENCE_NORMAL):
         uuid = str(uuid4().hex)
         self.log.debug("Submitting job %s with data %s" % (name, data))
         job = gear.Job(name,
                        json.dumps(data),
                        unique=uuid)
         self.build_sets[uuid] = build_set
-        self.gearman.submitJob(job)
+        self.gearman.submitJob(job, precedence=precedence)
 
-    def mergeChanges(self, items, build_set):
+    def mergeChanges(self, items, build_set,
+                     precedence=zuul.model.PRECEDENCE_NORMAL):
         data = dict(items=items)
-        self.submitJob('merger:merge', data, build_set)
+        self.submitJob('merger:merge', data, build_set, precedence)
 
-    def updateRepo(self, project, url, build_set):
+    def updateRepo(self, project, url, build_set,
+                   precedence=zuul.model.PRECEDENCE_NORMAL):
         data = dict(project=project,
                     url=url)
-        self.submitJob('merger:update', data, build_set)
+        self.submitJob('merger:update', data, build_set, precedence)
 
     def onBuildCompleted(self, job):
         build_set = self.build_sets.get(job.unique)
diff --git a/zuul/model.py b/zuul/model.py
index b85f3eb..67ce8be 100644
--- a/zuul/model.py
+++ b/zuul/model.py
@@ -77,7 +77,7 @@
         self.manager = None
         self.queues = []
         self.precedence = PRECEDENCE_NORMAL
-        self.trigger = None
+        self.source = None
         self.start_actions = None
         self.success_actions = None
         self.failure_actions = None
@@ -589,6 +589,12 @@
     PENDING = 2
     COMPLETE = 3
 
+    states_map = {
+        1: 'NEW',
+        2: 'PENDING',
+        3: 'COMPLETE',
+    }
+
     def __init__(self, item):
         self.item = item
         self.other_changes = []
@@ -603,6 +609,12 @@
         self.failing_reasons = []
         self.merge_state = self.NEW
 
+    def __repr__(self):
+        return '<BuildSet item: %s #builds: %s merge state: %s>' % (
+            self.item,
+            len(self.builds),
+            self.getStateName(self.merge_state))
+
     def setConfiguration(self):
         # The change isn't enqueued until after it's created
         # so we don't know what the other changes ahead will be
@@ -615,6 +627,10 @@
         if not self.ref:
             self.ref = 'Z' + uuid4().hex
 
+    def getStateName(self, state_num):
+        return self.states_map.get(
+            state_num, 'UNKNOWN (%s)' % state_num)
+
     def addBuild(self, build):
         self.builds[build.job.name] = build
         build.build_set = self
@@ -693,6 +709,10 @@
         ret['project'] = changeish.project.name
         ret['enqueue_time'] = int(self.enqueue_time * 1000)
         ret['jobs'] = []
+        if hasattr(changeish, 'owner'):
+            ret['owner'] = changeish.owner
+        else:
+            ret['owner'] = None
         max_remaining = 0
         for job in self.pipeline.getJobs(changeish):
             now = time.time()
@@ -841,6 +861,7 @@
         self.approvals = []
         self.open = None
         self.status = None
+        self.owner = None
 
     def _id(self):
         return '%s,%s' % (self.number, self.patchset)
@@ -947,6 +968,8 @@
         self.newrev = None
         # timer
         self.timespec = None
+        # zuultrigger
+        self.pipeline_name = None
         # For events that arrive with a destination pipeline (eg, from
         # an admin command, etc):
         self.forced_pipeline = None
@@ -965,20 +988,6 @@
 
         return ret
 
-    def getChange(self, project, trigger):
-        if self.change_number:
-            change = trigger.getChange(self.change_number, self.patch_number)
-        elif self.ref:
-            change = Ref(project)
-            change.ref = self.ref
-            change.oldrev = self.oldrev
-            change.newrev = self.newrev
-            change.url = trigger.getGitwebUrl(project, sha=self.newrev)
-        else:
-            change = NullChange(project)
-
-        return change
-
 
 class BaseFilter(object):
     def __init__(self, required_approvals=[]):
@@ -1038,23 +1047,26 @@
 
 
 class EventFilter(BaseFilter):
-    def __init__(self, types=[], branches=[], refs=[], event_approvals={},
-                 comments=[], emails=[], usernames=[], timespecs=[],
-                 required_approvals=[]):
+    def __init__(self, trigger, types=[], branches=[], refs=[],
+                 event_approvals={}, comments=[], emails=[], usernames=[],
+                 timespecs=[], required_approvals=[], pipelines=[]):
         super(EventFilter, self).__init__(
             required_approvals=required_approvals)
+        self.trigger = trigger
         self._types = types
         self._branches = branches
         self._refs = refs
         self._comments = comments
         self._emails = emails
         self._usernames = usernames
+        self._pipelines = pipelines
         self.types = [re.compile(x) for x in types]
         self.branches = [re.compile(x) for x in branches]
         self.refs = [re.compile(x) for x in refs]
         self.comments = [re.compile(x) for x in comments]
         self.emails = [re.compile(x) for x in emails]
         self.usernames = [re.compile(x) for x in usernames]
+        self.pipelines = [re.compile(x) for x in pipelines]
         self.event_approvals = event_approvals
         self.timespecs = timespecs
 
@@ -1063,6 +1075,8 @@
 
         if self._types:
             ret += ' types: %s' % ', '.join(self._types)
+        if self._pipelines:
+            ret += ' pipelines: %s' % ', '.join(self._pipelines)
         if self._branches:
             ret += ' branches: %s' % ', '.join(self._branches)
         if self._refs:
@@ -1094,6 +1108,14 @@
         if self.types and not matches_type:
             return False
 
+        # pipelines are ORed
+        matches_pipeline = False
+        for epipe in self.pipelines:
+            if epipe.match(event.pipeline_name):
+                matches_pipeline = True
+        if self.pipelines and not matches_pipeline:
+            return False
+
         # branches are ORed
         matches_branch = False
         for branch in self.branches:
@@ -1104,9 +1126,10 @@
 
         # refs are ORed
         matches_ref = False
-        for ref in self.refs:
-            if ref.match(event.ref):
-                matches_ref = True
+        if event.ref is not None:
+            for ref in self.refs:
+                if ref.match(event.ref):
+                    matches_ref = True
         if self.refs and not matches_ref:
             return False
 
diff --git a/zuul/rpclistener.py b/zuul/rpclistener.py
index fcf1161..05b8d03 100644
--- a/zuul/rpclistener.py
+++ b/zuul/rpclistener.py
@@ -109,7 +109,7 @@
         if not errors:
             event.change_number, event.patch_number = args['change'].split(',')
             try:
-                event.getChange(project, trigger)
+                pipeline.source.getChange(event, project)
             except Exception:
                 errors += 'Invalid change: %s\n' % (args['change'],)
 
diff --git a/zuul/scheduler.py b/zuul/scheduler.py
index b3163b3..9effcb8 100644
--- a/zuul/scheduler.py
+++ b/zuul/scheduler.py
@@ -1,4 +1,4 @@
-# Copyright 2012 Hewlett-Packard Development Company, L.P.
+# Copyright 2012-2014 Hewlett-Packard Development Company, L.P.
 # Copyright 2013 OpenStack Foundation
 # Copyright 2013 Antoine "hashar" Musso
 # Copyright 2013 Wikimedia Foundation Inc.
@@ -183,7 +183,6 @@
         self.triggers = dict()
         self.reporters = dict()
         self.config = None
-        self._maintain_trigger_cache = False
 
         self.trigger_event_queue = Queue.Queue()
         self.result_event_queue = Queue.Queue()
@@ -235,6 +234,8 @@
         for conf_pipeline in data.get('pipelines', []):
             pipeline = Pipeline(conf_pipeline['name'])
             pipeline.description = conf_pipeline.get('description')
+            # TODO(jeblair): remove backwards compatibility:
+            pipeline.source = self.triggers[conf_pipeline.get('source', 'gerrit')]
             precedence = model.PRECEDENCE_MAP[conf_pipeline.get('precedence')]
             pipeline.precedence = precedence
             pipeline.failure_message = conf_pipeline.get('failure-message',
@@ -298,7 +299,6 @@
             # TODO: move this into triggers (may require pluggable
             # configuration)
             if 'gerrit' in conf_pipeline['trigger']:
-                pipeline.trigger = self.triggers['gerrit']
                 for trigger in toList(conf_pipeline['trigger']['gerrit']):
                     approvals = {}
                     for approval_dict in toList(trigger.get('approval')):
@@ -314,7 +314,8 @@
                     usernames = toList(trigger.get('username'))
                     if not usernames:
                         usernames = toList(trigger.get('username_filter'))
-                    f = EventFilter(types=toList(trigger['event']),
+                    f = EventFilter(trigger=self.triggers['gerrit'],
+                                    types=toList(trigger['event']),
                                     branches=toList(trigger.get('branch')),
                                     refs=toList(trigger.get('ref')),
                                     event_approvals=approvals,
@@ -324,12 +325,20 @@
                                     required_approvals=
                                     toList(trigger.get('require-approval')))
                     manager.event_filters.append(f)
-            elif 'timer' in conf_pipeline['trigger']:
-                pipeline.trigger = self.triggers['timer']
+            if 'timer' in conf_pipeline['trigger']:
                 for trigger in toList(conf_pipeline['trigger']['timer']):
-                    f = EventFilter(types=['timer'],
+                    f = EventFilter(trigger=self.triggers['timer'],
+                                    types=['timer'],
                                     timespecs=toList(trigger['time']))
                     manager.event_filters.append(f)
+            if 'zuul' in conf_pipeline['trigger']:
+                for trigger in toList(conf_pipeline['trigger']['zuul']):
+                    f = EventFilter(trigger=self.triggers['zuul'],
+                                    types=toList(trigger['event']),
+                                    pipelines=toList(trigger.get('pipeline')),
+                                    required_approvals=
+                                    toList(trigger.get('require-approval')))
+                    manager.event_filters.append(f)
 
         for project_template in data.get('project-templates', []):
             # Make sure the template only contains valid pipelines
@@ -657,6 +666,7 @@
                             "Exception while canceling build %s "
                             "for change %s" % (build, item.change))
             self.layout = layout
+            self.maintainTriggerCache()
             for trigger in self.triggers.values():
                 trigger.postConfig()
             if statsd:
@@ -714,8 +724,7 @@
     def _doEnqueueEvent(self, event):
         project = self.layout.projects.get(event.project_name)
         pipeline = self.layout.pipelines[event.forced_pipeline]
-        trigger = self.triggers.get(event.trigger_name)
-        change = event.getChange(project, trigger)
+        change = pipeline.source.getChange(event, project)
         self.log.debug("Event %s for change %s was directly assigned "
                        "to pipeline %s" % (event, change, self))
         self.log.info("Adding %s, %s to %s" %
@@ -775,10 +784,6 @@
                     while pipeline.manager.processQueue():
                         pass
 
-                if self._maintain_trigger_cache:
-                    self.maintainTriggerCache()
-                    self._maintain_trigger_cache = False
-
             except Exception:
                 self.log.exception("Exception in run handler:")
                 # There may still be more events to process
@@ -809,8 +814,7 @@
                 return
 
             for pipeline in self.layout.pipelines.values():
-                change = event.getChange(project,
-                                         self.triggers.get(event.trigger_name))
+                change = pipeline.source.getChange(event, project)
                 if event.type == 'patchset-created':
                     pipeline.manager.removeOldVersionsOfChange(change)
                 elif event.type == 'change-abandoned':
@@ -944,6 +948,7 @@
 
     def _postConfig(self, layout):
         self.log.info("Configured Pipeline Manager %s" % self.pipeline.name)
+        self.log.info("  Source: %s" % self.pipeline.source)
         self.log.info("  Requirements:")
         for f in self.changeish_filters:
             self.log.info("    %s" % f)
@@ -1152,6 +1157,7 @@
                 item.enqueue_time = enqueue_time
             self.reportStats(item)
             self.enqueueChangesBehind(change, quiet, ignore_requirements)
+            self.sched.triggers['zuul'].onChangeEnqueued(item.change, self.pipeline)
         else:
             self.log.error("Unable to find change queue for project %s" %
                            change.project)
@@ -1161,7 +1167,6 @@
         self.log.debug("Removing change %s from queue" % item.change)
         change_queue = self.pipeline.getQueue(item.change.project)
         change_queue.dequeueItem(item)
-        self.sched._maintain_trigger_cache = True
 
     def removeChange(self, change):
         # Remove a change from the queue, probably because it has been
@@ -1188,7 +1193,7 @@
             oldrev = item.change.oldrev
             newrev = item.change.newrev
         return dict(project=item.change.project.name,
-                    url=self.pipeline.trigger.getGitUrl(
+                    url=self.pipeline.source.getGitUrl(
                         item.change.project),
                     merge_mode=item.change.project.merge_mode,
                     refspec=item.change.refspec,
@@ -1217,12 +1222,14 @@
             all_items = dependent_items + [item]
             merger_items = map(self._makeMergerItem, all_items)
             self.sched.merger.mergeChanges(merger_items,
-                                           item.current_build_set)
+                                           item.current_build_set,
+                                           self.pipeline.precedence)
         else:
             self.log.debug("Preparing update repo for: %s" % item.change)
-            url = self.pipeline.trigger.getGitUrl(item.change.project)
+            url = self.pipeline.source.getGitUrl(item.change.project)
             self.sched.merger.updateRepo(item.change.project.name,
-                                         url, build_set)
+                                         url, build_set,
+                                         self.pipeline.precedence)
         return False
 
     def _launchJobs(self, item, jobs):
@@ -1410,8 +1417,8 @@
             succeeded = self.pipeline.didAllJobsSucceed(item)
             merged = item.reported
             if merged:
-                merged = self.pipeline.trigger.isMerged(item.change,
-                                                        item.change.branch)
+                merged = self.pipeline.source.isMerged(item.change,
+                                                       item.change.branch)
             self.log.info("Reported change %s status: all-succeeded: %s, "
                           "merged: %s" % (item.change, succeeded, merged))
             change_queue = self.pipeline.getQueue(item.change.project)
@@ -1426,6 +1433,7 @@
                 change_queue.increaseWindowSize()
                 self.log.debug("%s window size increased to %s" %
                                (change_queue, change_queue.window))
+                self.sched.triggers['zuul'].onChangeMerged(item.change)
 
     def _reportItem(self, item):
         self.log.debug("Reporting change %s" % item.change)
@@ -1737,8 +1745,8 @@
         return new_change_queues
 
     def isChangeReadyToBeEnqueued(self, change):
-        if not self.pipeline.trigger.canMerge(change,
-                                              self.getSubmitAllowNeeds()):
+        if not self.pipeline.source.canMerge(change,
+                                             self.getSubmitAllowNeeds()):
             self.log.debug("Change %s can not merge, ignoring" % change)
             return False
         return True
@@ -1750,8 +1758,8 @@
             self.log.debug("  Changeish does not support dependencies")
             return
         for needs in change.needed_by_changes:
-            if self.pipeline.trigger.canMerge(needs,
-                                              self.getSubmitAllowNeeds()):
+            if self.pipeline.source.canMerge(needs,
+                                             self.getSubmitAllowNeeds()):
                 self.log.debug("  Change %s needs %s and is ready to merge" %
                                (needs, change))
                 to_enqueue.append(needs)
@@ -1790,8 +1798,8 @@
         if self.isChangeAlreadyInQueue(change.needs_change):
             self.log.debug("  Needed change is already ahead in the queue")
             return True
-        if self.pipeline.trigger.canMerge(change.needs_change,
-                                          self.getSubmitAllowNeeds()):
+        if self.pipeline.source.canMerge(change.needs_change,
+                                         self.getSubmitAllowNeeds()):
             self.log.debug("  Change %s is needed" %
                            change.needs_change)
             return change.needs_change
diff --git a/zuul/trigger/gerrit.py b/zuul/trigger/gerrit.py
index 4840f5a..368e37d 100644
--- a/zuul/trigger/gerrit.py
+++ b/zuul/trigger/gerrit.py
@@ -18,7 +18,7 @@
 import urllib2
 import voluptuous
 from zuul.lib import gerrit
-from zuul.model import TriggerEvent, Change
+from zuul.model import TriggerEvent, Change, Ref, NullChange
 
 
 class GerritEventConnector(threading.Thread):
@@ -85,12 +85,12 @@
             event.account = None
 
         if event.change_number:
-            # Call getChange for the side effect of updating the
+            # Call _getChange for the side effect of updating the
             # cache.  Note that this modifies Change objects outside
             # the main thread.
-            self.trigger.getChange(event.change_number,
-                                   event.patch_number,
-                                   refresh=True)
+            self.trigger._getChange(event.change_number,
+                                    event.patch_number,
+                                    refresh=True)
 
         self.sched.addEvent(event)
         self.gerrit.eventDone()
@@ -280,7 +280,7 @@
     def maintainCache(self, relevant):
         # This lets the user supply a list of change objects that are
         # still in use.  Anything in our cache that isn't in the supplied
-        # list should be same to remove from the cache.
+        # list should be safe to remove from the cache.
         remove = []
         for key, change in self._change_cache.items():
             if change not in relevant:
@@ -291,7 +291,20 @@
     def postConfig(self):
         pass
 
-    def getChange(self, number, patchset, refresh=False):
+    def getChange(self, event, project):
+        if event.change_number:
+            change = self._getChange(event.change_number, event.patch_number)
+        elif event.ref:
+            change = Ref(project)
+            change.ref = event.ref
+            change.oldrev = event.oldrev
+            change.newrev = event.newrev
+            change.url = self.getGitwebUrl(project, sha=event.newrev)
+        else:
+            change = NullChange(project)
+        return change
+
+    def _getChange(self, number, patchset, refresh=False):
         key = '%s,%s' % (number, patchset)
         change = None
         if key in self._change_cache:
@@ -311,6 +324,21 @@
             raise
         return change
 
+    def getProjectOpenChanges(self, project):
+        # This is a best-effort function in case Gerrit is unable to return
+        # a particular change.  It happens.
+        query = "project:%s status:open" % (project.name,)
+        self.log.debug("Running query %s to get project open changes" % (query,))
+        data = self.gerrit.simpleQuery(query)
+        changes = []
+        for record in data[:-1]:
+            try:
+                changes.append(self._getChange(record['number'],
+                                               record['currentPatchSet']['number']))
+            except Exception:
+                self.log.exception("Unable to query change %s" % (record.get('number'),))
+        return changes
+
     def updateChange(self, change):
         self.log.info("Updating information for %s,%s" %
                       (change.number, change.patchset))
@@ -341,6 +369,11 @@
             change.is_current_patchset = False
 
         change.is_merged = self._isMerged(change)
+        change.approvals = data['currentPatchSet'].get('approvals', [])
+        change.open = data['open']
+        change.status = data['status']
+        change.owner = data['owner']
+
         if change.is_merged:
             # This change is merged, so we don't need to look any further
             # for dependencies.
@@ -350,7 +383,7 @@
         if 'dependsOn' in data:
             parts = data['dependsOn'][0]['ref'].split('/')
             dep_num, dep_ps = parts[3], parts[4]
-            dep = self.getChange(dep_num, dep_ps)
+            dep = self._getChange(dep_num, dep_ps)
             if not dep.is_merged:
                 change.needs_change = dep
 
@@ -359,14 +392,10 @@
             for needed in data['neededBy']:
                 parts = needed['ref'].split('/')
                 dep_num, dep_ps = parts[3], parts[4]
-                dep = self.getChange(dep_num, dep_ps)
+                dep = self._getChange(dep_num, dep_ps)
                 if not dep.is_merged and dep.is_current_patchset:
                     change.needed_by_changes.append(dep)
 
-        change.approvals = data['currentPatchSet'].get('approvals', [])
-        change.open = data['open']
-        change.status = data['status']
-
         return change
 
     def getGitUrl(self, project):
diff --git a/zuul/trigger/timer.py b/zuul/trigger/timer.py
index 904fa7a..3d5cd9b 100644
--- a/zuul/trigger/timer.py
+++ b/zuul/trigger/timer.py
@@ -56,9 +56,9 @@
         for job in self.apsched.get_jobs():
             self.apsched.unschedule_job(job)
         for pipeline in self.sched.layout.pipelines.values():
-            if pipeline.trigger != self:
-                continue
             for ef in pipeline.manager.event_filters:
+                if ef.trigger != self:
+                    continue
                 for timespec in ef.timespecs:
                     parts = timespec.split()
                     if len(parts) < 5 or len(parts) > 6:
@@ -82,15 +82,11 @@
                                               args=(pipeline.name,
                                                     timespec,))
 
-    def getChange(self, number, patchset, refresh=False):
+    def getChange(self, event, project):
         raise Exception("Timer trigger does not support changes.")
 
     def getGitUrl(self, project):
-        # For the moment, the timer trigger requires gerrit.
-        return self.sched.triggers['gerrit'].getGitUrl(project)
+        raise Exception("Timer trigger does not support changes.")
 
     def getGitwebUrl(self, project, sha=None):
-        url = '%s/gitweb?p=%s.git' % (self.baseurl, project)
-        if sha:
-            url += ';a=commitdiff;h=' + sha
-        return url
+        raise Exception("Timer trigger does not support changes.")
diff --git a/zuul/trigger/zuultrigger.py b/zuul/trigger/zuultrigger.py
new file mode 100644
index 0000000..27098ab
--- /dev/null
+++ b/zuul/trigger/zuultrigger.py
@@ -0,0 +1,119 @@
+# Copyright 2012-2014 Hewlett-Packard Development Company, L.P.
+# Copyright 2013 OpenStack Foundation
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import logging
+from zuul.model import TriggerEvent
+
+
+class ZuulTrigger(object):
+    name = 'zuul'
+    log = logging.getLogger("zuul.ZuulTrigger")
+
+    def __init__(self, config, sched):
+        self.sched = sched
+        self.config = config
+        self._handle_parent_change_enqueued_events = False
+        self._handle_project_change_merged_events = False
+
+    def stop(self):
+        pass
+
+    def isMerged(self, change, head=None):
+        raise Exception("Zuul trigger does not support checking if "
+                        "a change is merged.")
+
+    def canMerge(self, change, allow_needs):
+        raise Exception("Zuul trigger does not support checking if "
+                        "a change can merge.")
+
+    def maintainCache(self, relevant):
+        return
+
+    def onChangeMerged(self, change):
+        # Called each time zuul merges a change
+        if self._handle_project_change_merged_events:
+            try:
+                self._createProjectChangeMergedEvents(change)
+            except Exception:
+                self.log.exception("Unable to create project-change-merged events for %s" %
+                                   (change,))
+
+    def onChangeEnqueued(self, change, pipeline):
+        # Called each time a change is enqueued in a pipeline
+        if self._handle_parent_change_enqueued_events:
+            try:
+                self._createParentChangeEnqueuedEvents(change, pipeline)
+            except Exception:
+                self.log.exception("Unable to create parent-change-enqueued events for %s in %s" %
+                                   (change, pipeline))
+
+    def _createProjectChangeMergedEvents(self, change):
+        changes = self.sched.triggers['gerrit'].getProjectOpenChanges(change.project)
+        for open_change in changes:
+            self._createProjectChangeMergedEvent(open_change)
+
+    def _createProjectChangeMergedEvent(self, change):
+        event = TriggerEvent()
+        event.type = 'project-change-merged'
+        event.trigger_name = self.name
+        event.project_name = change.project.name
+        event.change_number = change.number
+        event.branch = change.branch
+        event.change_url = change.url
+        event.patch_number = change.patchset
+        event.refspec = change.refspec
+        self.sched.addEvent(event)
+
+    def _createParentChangeEnqueuedEvents(self, change, pipeline):
+        self.log.debug("Checking for changes needing %s:" % change)
+        if not hasattr(change, 'needed_by_changes'):
+            self.log.debug("  Changeish does not support dependencies")
+            return
+        for needs in change.needed_by_changes:
+            self._createParentChangeEnqueuedEvent(needs, pipeline)
+
+    def _createParentChangeEnqueuedEvent(self, change, pipeline):
+        event = TriggerEvent()
+        event.type = 'parent-change-enqueued'
+        event.trigger_name = self.name
+        event.pipeline_name = pipeline.name
+        event.project_name = change.project.name
+        event.change_number = change.number
+        event.branch = change.branch
+        event.change_url = change.url
+        event.patch_number = change.patchset
+        event.refspec = change.refspec
+        self.sched.addEvent(event)
+
+    def postConfig(self):
+        self._handle_parent_change_enqueued_events = False
+        self._handle_project_change_merged_events = False
+        for pipeline in self.sched.layout.pipelines.values():
+            for ef in pipeline.manager.event_filters:
+                if ef.trigger != self:
+                    continue
+                if 'parent-change-enqueued' in ef._types:
+                    self._handle_parent_change_enqueued_events = True
+                elif 'project-change-merged' in ef._types:
+                    self._handle_project_change_merged_events = True
+
+    def getChange(self, number, patchset, refresh=False):
+        raise Exception("Zuul trigger does not support changes.")
+
+    def getGitUrl(self, project):
+        raise Exception("Zuul trigger does not support changes.")
+
+    def getGitwebUrl(self, project, sha=None):
+        raise Exception("Zuul trigger does not support changes.")
diff --git a/zuul/webapp.py b/zuul/webapp.py
index 4d6115f..e289398 100644
--- a/zuul/webapp.py
+++ b/zuul/webapp.py
@@ -13,13 +13,32 @@
 # License for the specific language governing permissions and limitations
 # under the License.
 
+import copy
+import json
 import logging
+import re
 import threading
 import time
 from paste import httpserver
 import webob
 from webob import dec
 
+"""Zuul main web app.
+
+Zuul supports HTTP requests directly against it for determining the
+change status. Responses are provided as JSON data structures.
+
+The supported URLs are:
+
+ - /status: return a complex data structure that represents the entire
+   queue / pipeline structure of the system
+ - /status.json (backwards compatibility): same as /status
+ - /status/change/X,Y: return status just for gerrit change X,Y
+
+When returning status for a single gerrit change you will get an
+array of changes; they will not include the queue structure.
+"""
+
 
 class WebApp(threading.Thread):
     log = logging.getLogger("zuul.WebApp")
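A hypothetical client-side use of the endpoints described in the module
docstring above; the host, port and change id are invented::

  import json
  import urllib2

  url = 'http://zuul.example.org:8001/status/change/1234,1'
  changes = json.loads(urllib2.urlopen(url).read())
  # The result is a flat list of matching change dicts, not the full
  # pipeline structure returned by /status.
  for change in changes:
      print change['id'], change['project']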
@@ -41,9 +60,44 @@
     def stop(self):
         self.server.server_close()
 
+    def _changes_by_func(self, func):
+        """Filter changes by a user-provided function.
+
+        To support collecting arbitrary subsets of changes, this
+        low-level filtering mechanism takes a function that is applied
+        to each change. The output is a flattened list of the changes
+        for which the function returns a true value.
+        """
+        status = []
+        jsonstruct = json.loads(self.cache)
+        for pipeline in jsonstruct['pipelines']:
+            for change_queue in pipeline['change_queues']:
+                for head in change_queue['heads']:
+                    for change in head:
+                        if func(change):
+                            status.append(copy.deepcopy(change))
+        return json.dumps(status)
+
+    def _status_for_change(self, rev):
+        """Return the statuses for a particular change id X,Y."""
+        def func(change):
+            return change['id'] == rev
+        return self._changes_by_func(func)
+
+    def _normalize_path(self, path):
+        # support legacy status.json as well as new /status
+        if path == '/status.json' or path == '/status':
+            return "status"
+        m = re.match(r'/status/change/(\d+,\d+)$', path)
+        if m:
+            return m.group(1)
+        return None
+
     def app(self, request):
-        if request.path != '/status.json':
+        path = self._normalize_path(request.path)
+        if path is None:
             raise webob.exc.HTTPNotFound()
+
         if (not self.cache or
             (time.time() - self.cache_time) > self.cache_expiry):
             try:
@@ -54,8 +108,18 @@
             except:
                 self.log.exception("Exception formatting status:")
                 raise
-        response = webob.Response(body=self.cache,
-                                  content_type='application/json')
+
+        if path == 'status':
+            response = webob.Response(body=self.cache,
+                                      content_type='application/json')
+        else:
+            status = self._status_for_change(path)
+            if status:
+                response = webob.Response(body=status,
+                                          content_type='application/json')
+            else:
+                raise webob.exc.HTTPNotFound()
+
         response.headers['Access-Control-Allow-Origin'] = '*'
         response.last_modified = self.cache_time
         return response
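For reference, a sketch of the nesting that ``_changes_by_func`` walks; the
key names match the status JSON used above, while the values are invented::

  status = {
      'pipelines': [{
          'change_queues': [{
              'heads': [[{'id': '1234,1', 'project': 'org/project'}]],
          }],
      }],
  }
  found = [change
           for pipeline in status['pipelines']
           for queue in pipeline['change_queues']
           for head in queue['heads']
           for change in head
           if change['id'] == '1234,1']
  assert len(found) == 1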