Merge "Make ZuulDaemonApp an abstract base class" into feature/zuulv3
diff --git a/.zuul.yaml b/.zuul.yaml
index a87c196..7473ad3 100644
--- a/.zuul.yaml
+++ b/.zuul.yaml
@@ -65,7 +65,5 @@
         - zuul-stream-functional
     post:
       jobs:
-        - publish-openstack-sphinx-docs-infra:
-            vars:
-              sphinx_python: python3
+        - publish-openstack-sphinx-docs-infra-python3
         - publish-openstack-python-branch-tarball
diff --git a/README.rst b/README.rst
index 52b89df..8d00665 100644
--- a/README.rst
+++ b/README.rst
@@ -10,6 +10,14 @@
 The latest documentation for Zuul v3 is published at:
 https://docs.openstack.org/infra/zuul/feature/zuulv3/
 
+If you are looking for the Netflix Edge routing service named Zuul,
+it can be found here:
+https://github.com/Netflix/zuul
+
+If you are looking for the JavaScript testing tool named Zuul, it
+can be found here:
+https://github.com/defunctzombie/zuul
+
 Contributing
 ------------
 
diff --git a/bindep.txt b/bindep.txt
index 85254b4..3dcc3e7 100644
--- a/bindep.txt
+++ b/bindep.txt
@@ -8,7 +8,7 @@
 zookeeperd [platform:dpkg]
 build-essential [platform:dpkg]
 gcc [platform:rpm]
-graphviz [test]
+graphviz [doc]
 libssl-dev [platform:dpkg]
 openssl-devel [platform:rpm]
 libffi-dev [platform:dpkg]
diff --git a/doc/source/admin/components.rst b/doc/source/admin/components.rst
index 3bec28a..d6b0984 100644
--- a/doc/source/admin/components.rst
+++ b/doc/source/admin/components.rst
@@ -8,7 +8,6 @@
 Zuul is a distributed system consisting of several components, each of
 which is described below.
 
-
 .. graphviz::
    :align: center
 
@@ -31,7 +30,27 @@
       Scheduler -- GitHub;
    }
 
+Each of the Zuul processes may run on the same host or on different
+hosts.  Within Zuul, the components communicate with the scheduler via
+the Gearman protocol, so each Zuul component needs to be able to
+connect to the host running the Gearman server (the scheduler has a
+built-in Gearman server, which is recommended) on the Gearman port --
+TCP port 4730 by default.
 
+The Zuul scheduler communicates with Nodepool via the ZooKeeper
+protocol.  Nodepool requires an external ZooKeeper cluster, and the
+Zuul scheduler needs to be able to connect to the hosts in that
+cluster on TCP port 2181.
+
+Both the Nodepool launchers and Zuul executors need to be able to
+communicate with the hosts which Nodepool provides.  If these are on
+private networks, the executors will need to be able to route traffic
+to them.
+
+If statsd is enabled, every service needs to be able to emit data to
+statsd.  Statsd can be configured to run on each host and forward
+data, or services may emit to a centralized statsd collector.  Statsd
+listens on UDP port 8125 by default.
 
 All Zuul processes read the ``/etc/zuul/zuul.conf`` file (an alternate
 location may be supplied on the command line) which uses an INI file
@@ -154,6 +173,23 @@
 items into pipelines, distributes jobs to executors, and reports
 results.
 
+The scheduler includes a Gearman server which is used to communicate
+with other components of Zuul.  It is possible to use an external
+Gearman server, but the built-in server is well-tested and
+recommended.  If the built-in server is used, other Zuul hosts will
+need to be able to connect to the scheduler on the Gearman port, TCP
+port 4730.  It is also strongly recommended to use SSL certs with
+Gearman, as secrets are transferred from the scheduler to executors
+over this link.
+
+The scheduler must be able to connect to the ZooKeeper cluster used by
+Nodepool in order to request nodes.  It does not need to connect
+directly to the nodes themselves, however -- that function is handled
+by the executors.
+
+It must also be able to connect to any services for which connections
+are configured (Gerrit, GitHub, etc.).
+
 Configuration
 ~~~~~~~~~~~~~
 
@@ -280,6 +316,10 @@
 Therefore, administrators may wish to run standalone mergers in order
 to reduce the load on executors.
 
+Mergers need to be able to connect to the Gearman server (usually the
+scheduler host) as well as any services for which connections are
+configured (Gerrit, GitHub, etc.).
+
 Configuration
 ~~~~~~~~~~~~~
 
@@ -358,6 +398,11 @@
 the executor performs both roles, small Zuul installations may not
 need to run standalone mergers.
 
+Executors need to be able to connect to the Gearman server (usually
+the scheduler host), any services for which connections are configured
+(Gerrit, GitHub, etc.), as well as directly to the hosts which Nodepool
+provides.
+
 Trusted and Untrusted Playbooks
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
@@ -408,7 +453,7 @@
       Path to command socket file for the executor process.
 
    .. attr:: finger_port
-      :default: 79
+      :default: 7900
 
       Port to use for finger log streamer.
 
@@ -451,13 +496,6 @@
 
       SSH private key file to be used when logging into worker nodes.
 
-   .. attr:: user
-      :default: zuul
-
-      User ID for the zuul-executor process. In normal operation as a
-      daemon, the executor should be started as the ``root`` user, but
-      it will drop privileges to this user during startup.
-
    .. _admin_sitewide_variables:
 
    .. attr:: variables
@@ -515,7 +553,7 @@
       significant protections against malicious users and accidental
       breakage in playbooks. As such,  `nullwrap` is not recommended
       for use in production.
-      
+
       This option, and thus, `nullwrap`, may be removed in the future.
       `bubblewrap` has become integral to securely operating Zuul.  If you
       have a valid use case for it, we encourage you to let us know.
@@ -584,6 +622,12 @@
 streaming. Eventually, it will serve as the single process handling all
 HTTP interactions with Zuul.
 
+Web servers need to be able to connect to the Gearman server (usually
+the scheduler host).  If the SQL reporter is used, they need to be
+able to connect to the database it reports to in order to support the
+dashboard.  If a GitHub connection is configured, they need to be
+reachable by GitHub so they may receive notifications.
+
 Configuration
 ~~~~~~~~~~~~~
 
@@ -643,6 +687,10 @@
 
 The above would stream the logs for the build identified by `UUID`.
 
+Finger gateway servers need to be able to connect to the Gearman
+server (usually the scheduler host), as well as the console streaming
+port on the executors (usually 7900).
+
 Configuration
 ~~~~~~~~~~~~~
 
diff --git a/doc/source/admin/connections.rst b/doc/source/admin/connections.rst
index 29ca3be..b04dbb0 100644
--- a/doc/source/admin/connections.rst
+++ b/doc/source/admin/connections.rst
@@ -33,6 +33,15 @@
   driver=gerrit
   server=review.example.com
 
+Zuul needs to use a single connection to look up information about
+changes hosted by a given system.  When it looks up changes, it will
+do so using the first connection it finds that matches the server name
+it's looking for.  It's generally best to use only a single connection
+for a given server; however, if you need more than one (for example,
+to satisfy unique reporting requirements), be sure to list the primary
+connection first, as that is what Zuul will use to look up all changes
+for that server.
+
 .. _drivers:
 
 Drivers
@@ -55,6 +64,7 @@
 
    drivers/gerrit
    drivers/github
+   drivers/git
    drivers/smtp
    drivers/sql
    drivers/timer
diff --git a/doc/source/admin/drivers/git.rst b/doc/source/admin/drivers/git.rst
new file mode 100644
index 0000000..e0acec1
--- /dev/null
+++ b/doc/source/admin/drivers/git.rst
@@ -0,0 +1,59 @@
+:title: Git Driver
+
+Git
+===
+
+This driver can be used to load Zuul configuration from public Git repositories,
+for instance from ``openstack-infra/zuul-jobs``, which is suitable for use by
+any Zuul system. It can also be used to trigger jobs from ``ref-updated`` events
+in a pipeline.
+
+Connection Configuration
+------------------------
+
+The supported options in ``zuul.conf`` connections are:
+
+.. attr:: <git connection>
+
+   .. attr:: driver
+      :required:
+
+      .. value:: git
+
+         The connection must set ``driver=git`` for Git connections.
+
+   .. attr:: baseurl
+
+      Path to the base Git URL.  The name of the Git repository will be
+      appended to it.
+
+   .. attr:: poll_delay
+      :default: 7200
+
+      The delay, in seconds, between iterations of the Git repository
+      polling loop.
+
+Trigger Configuration
+---------------------
+
+.. attr:: pipeline.trigger.<git source>
+
+   The dictionary passed to the Git pipeline ``trigger`` attribute
+   supports the following attributes:
+
+   .. attr:: event
+      :required:
+
+      Only ``ref-updated`` is supported.
+
+   .. attr:: ref
+
+      On ref-updated events, a ref such as ``refs/heads/master`` or
+      ``^refs/tags/.*$``. This field is treated as a regular expression,
+      and multiple refs may be listed.
+
+   .. attr:: ignore-deletes
+      :default: true
+
+      When a ref is deleted, a ref-updated event is emitted with a
+      newrev of all zeros specified. The ``ignore-deletes`` field is a
+      boolean value that describes whether or not these newrevs
+      trigger ref-updated events.
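+
+As an illustration, a pipeline triggered by branch updates from a
+connection using this driver might look like the following sketch (the
+pipeline and connection names are examples, not requirements)::
+
+  - pipeline:
+      name: post
+      manager: independent
+      trigger:
+        git:
+          - event: ref-updated
+            ref: ^refs/heads/.*$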
diff --git a/doc/source/admin/drivers/zuul.rst b/doc/source/admin/drivers/zuul.rst
index d95dffc..41535ee 100644
--- a/doc/source/admin/drivers/zuul.rst
+++ b/doc/source/admin/drivers/zuul.rst
@@ -26,6 +26,12 @@
          When Zuul merges a change to a project, it generates this
          event for every open change in the project.
 
+         .. warning::
+
+            Triggering on this event can cause poor performance when
+            using the GitHub driver with a large number of
+            installations.
+
       .. value:: parent-change-enqueued
 
          When Zuul enqueues a change into any pipeline, it generates
diff --git a/doc/source/index.rst b/doc/source/index.rst
index 677e958..6e1b52e 100644
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -12,6 +12,14 @@
 :doc:`admin/index` useful.  If you want help make Zuul itself better,
 take a look at the :doc:`developer/index`.
 
+If you are looking for the Netflix Edge routing service named Zuul,
+it can be found here:
+https://github.com/Netflix/zuul
+
+If you are looking for the JavaScript testing tool named Zuul, it
+can be found here:
+https://github.com/defunctzombie/zuul
+
 Contents:
 
 .. toctree::
diff --git a/doc/source/user/config.rst b/doc/source/user/config.rst
index 916e66a..525cb38 100644
--- a/doc/source/user/config.rst
+++ b/doc/source/user/config.rst
@@ -539,6 +539,13 @@
       specified in a project's pipeline, set this attribute to
       ``true``.
 
+   .. attr:: protected
+      :default: false
+
+      When set to ``true``, only jobs defined in the same project may
+      inherit from this job.  Once this is set to ``true``, it cannot be
+      reset to ``false``.
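+
+      For example, a job might be marked protected in its own
+      project's configuration like this sketch (the job and playbook
+      names are illustrative)::
+
+        - job:
+            name: job-protected
+            protected: true
+            run: playbooks/job-protected.yaml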
+
    .. attr:: success-message
       :default: SUCCESS
 
@@ -1032,11 +1039,12 @@
    The following attributes may appear in a project:
 
    .. attr:: name
-      :required:
 
       The name of the project.  If Zuul is configured with two or more
       unique projects with the same name, the canonical hostname for
       the project should be included (e.g., `git.example.com/foo`).
+      If not given, it is implicitly derived from the project in which
+      it is defined.
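+
+      For example, a project stanza in a repository's own
+      ``.zuul.yaml`` may omit the name entirely (a minimal sketch; the
+      job name is illustrative)::
+
+        - project:
+            check:
+              jobs:
+                - test-project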
 
    .. attr:: templates
 
diff --git a/doc/source/user/gating.rst b/doc/source/user/gating.rst
index 795df72..543a8cc 100644
--- a/doc/source/user/gating.rst
+++ b/doc/source/user/gating.rst
@@ -246,11 +246,25 @@
 between changes in different git repositories.  Change A may depend on
 B, but B may not depend on A.
 
-.. TODO: update for v3 crd syntax
+To use them, include ``Depends-On: <change-url>`` in the footer of a
+commit message.  For example, a change which depends on a GitHub pull
+request (PR #4) might have the following footer::
 
-To use them, include ``Depends-On: <gerrit-change-id>`` in the footer of
-a commit message.  Use the full Change-ID ('I' + 40 characters).
+  Depends-On: https://github.com/example/test/pull/4
 
+And a change which depends on a Gerrit change (change number 3)::
+
+  Depends-On: https://review.example.com/3
+
+Changes may depend on changes in any other project, even projects not
+on the same system (e.g., a Gerrit change may depend on a GitHub pull
+request).
+
+.. note::
+
+   An older syntax which specifies dependencies using Gerrit
+   change-ids is still supported; however, it is deprecated and will
+   be removed in a future version.
 
 Dependent Pipeline
 ~~~~~~~~~~~~~~~~~~
@@ -277,7 +291,7 @@
     B_status [ class = greendot ]
     B_status -- A_status
 
-    'Change B\nChange-Id: Iabc' <- 'Change A\nDepends-On: Iabc'
+    'Change B\nURL: .../4' <- 'Change A\nDepends-On: .../4'
   }
 
 If tests for B fail, both B and A will be removed from the pipeline, and
@@ -328,7 +342,7 @@
     B_status [class = "dot", color = grey]
     B_status -- A_status
 
-    "Change B" <- "Change A\nDepends-On: B"
+    "Change B\nURL: .../4" <- "Change A\nDepends-On: .../4"
   }
 
 This is to indicate that the grey changes are only there to establish
@@ -337,56 +351,13 @@
 additionally will appear as its own red or green dot for its test.
 
 
-.. TODO: relevant for v3?
-
 Multiple Changes
 ~~~~~~~~~~~~~~~~
 
-A Gerrit change ID may refer to multiple changes (on multiple branches
-of the same project, or even multiple projects).  In these cases, Zuul
-will treat all of the changes with that change ID as dependencies.  So
-if you say that change in project A Depends-On a change ID that has
-changes in two branches of project B, then when testing the change to
-project A, both project B changes will be applied, and when deciding
-whether the project A change can merge, both changes must merge ahead
-of it.
-
-.. blockdiag::
-  :align: center
-
-  blockdiag crdmultirepos {
-    orientation = portrait
-    span_width = 30
-    class greendot [
-        label = "",
-        shape = circle,
-        color = green,
-        width = 20, height = 20
-    ]
-
-    B_stable_status [ class = "greendot" ]
-    B_master_status [ class = "greendot" ]
-    A_status [ class = "greendot" ]
-    B_stable_status -- B_master_status -- A_status
-
-    A [ label = "Repo A\nDepends-On: I123" ]
-    group {
-        orientation = portrait
-        label = "Dependencies"
-        color = "lightgray"
-
-        B_stable [ label = "Repo B\nChange-Id: I123\nBranch: stable" ]
-        B_master [ label = "Repo B\nChange-Id: I123\nBranch: master" ]
-    }
-    B_master <- A
-    B_stable <- A
-
-  }
-
-A change may depend on more than one Gerrit change ID as well.  So it
-is possible for a change in project A to depend on a change in project
-B and a change in project C.  Simply add more ``Depends-On:`` lines to
-the commit message footer.
+A change may list more than one dependency by simply adding more
+``Depends-On:`` lines to the commit message footer.  It is possible
+for a change in project A to depend on a change in project B and a
+change in project C.
 
 .. blockdiag::
   :align: center
@@ -406,20 +377,18 @@
     A_status [ class = "greendot" ]
     C_status -- B_status -- A_status
 
-    A [ label = "Repo A\nDepends-On: I123\nDepends-On: Iabc" ]
+    A [ label = "Repo A\nDepends-On: .../3\nDepends-On: .../4" ]
     group {
         orientation = portrait
         label = "Dependencies"
         color = "lightgray"
 
-        B [ label = "Repo B\nChange-Id: I123" ]
-        C [ label = "Repo C\nChange-Id: Iabc" ]
+        B [ label = "Repo B\nURL: .../3" ]
+        C [ label = "Repo C\nURL: .../4" ]
     }
     B, C <- A
   }
 
-.. TODO: update for v3
-
 Cycles
 ~~~~~~
 
diff --git a/tests/base.py b/tests/base.py
index 69d9f55..c449242 100755
--- a/tests/base.py
+++ b/tests/base.py
@@ -40,7 +40,6 @@
 import uuid
 import urllib
 
-
 import git
 import gear
 import fixtures
@@ -53,6 +52,7 @@
 from git.exc import NoSuchPathError
 import yaml
 
+import tests.fakegithub
 import zuul.driver.gerrit.gerritsource as gerritsource
 import zuul.driver.gerrit.gerritconnection as gerritconnection
 import zuul.driver.github.githubconnection as githubconnection
@@ -170,7 +170,7 @@
             'status': status,
             'subject': subject,
             'submitRecords': [],
-            'url': 'https://hostname/%s' % number}
+            'url': 'https://%s/%s' % (self.gerrit.server, number)}
 
         self.upstream_root = upstream_root
         self.addPatchset(files=files, parent=parent)
@@ -559,14 +559,13 @@
             return change.query()
         return {}
 
-    def simpleQuery(self, query):
-        self.log.debug("simpleQuery: %s" % query)
-        self.queries.append(query)
+    def _simpleQuery(self, query):
         if query.startswith('change:'):
             # Query a specific changeid
             changeid = query[len('change:'):]
             l = [change.query() for change in self.changes.values()
-                 if change.data['id'] == changeid]
+                 if (change.data['id'] == changeid or
+                     change.data['number'] == changeid)]
         elif query.startswith('message:'):
             # Query the content of a commit message
             msg = query[len('message:'):].strip()
@@ -577,6 +576,20 @@
             l = [change.query() for change in self.changes.values()]
         return l
 
+    def simpleQuery(self, query):
+        self.log.debug("simpleQuery: %s" % query)
+        self.queries.append(query)
+        results = []
+        if query.startswith('(') and 'OR' in query:
+            query = query[1:-2]
+            for q in query.split(' OR '):
+                for r in self._simpleQuery(q):
+                    if r not in results:
+                        results.append(r)
+        else:
+            results = self._simpleQuery(query)
+        return results
+
     def _start_watcher_thread(self, *args, **kw):
         pass
 
@@ -601,98 +614,6 @@
     _points_to_commits_only = True
 
 
-class FakeGithub(object):
-
-    class FakeUser(object):
-        def __init__(self, login):
-            self.login = login
-            self.name = "Github User"
-            self.email = "github.user@example.com"
-
-    class FakeBranch(object):
-        def __init__(self, branch='master'):
-            self.name = branch
-
-    class FakeStatus(object):
-        def __init__(self, state, url, description, context, user):
-            self._state = state
-            self._url = url
-            self._description = description
-            self._context = context
-            self._user = user
-
-        def as_dict(self):
-            return {
-                'state': self._state,
-                'url': self._url,
-                'description': self._description,
-                'context': self._context,
-                'creator': {
-                    'login': self._user
-                }
-            }
-
-    class FakeCommit(object):
-        def __init__(self):
-            self._statuses = []
-
-        def set_status(self, state, url, description, context, user):
-            status = FakeGithub.FakeStatus(
-                state, url, description, context, user)
-            # always insert a status to the front of the list, to represent
-            # the last status provided for a commit.
-            self._statuses.insert(0, status)
-
-        def statuses(self):
-            return self._statuses
-
-    class FakeRepository(object):
-        def __init__(self):
-            self._branches = [FakeGithub.FakeBranch()]
-            self._commits = {}
-
-        def branches(self, protected=False):
-            if protected:
-                # simulate there is no protected branch
-                return []
-            return self._branches
-
-        def create_status(self, sha, state, url, description, context,
-                          user='zuul'):
-            # Since we're bypassing github API, which would require a user, we
-            # default the user as 'zuul' here.
-            commit = self._commits.get(sha, None)
-            if commit is None:
-                commit = FakeGithub.FakeCommit()
-                self._commits[sha] = commit
-            commit.set_status(state, url, description, context, user)
-
-        def commit(self, sha):
-            commit = self._commits.get(sha, None)
-            if commit is None:
-                commit = FakeGithub.FakeCommit()
-                self._commits[sha] = commit
-            return commit
-
-    def __init__(self):
-        self._repos = {}
-
-    def user(self, login):
-        return self.FakeUser(login)
-
-    def repository(self, owner, proj):
-        return self._repos.get((owner, proj), None)
-
-    def repo_from_project(self, project):
-        # This is a convenience method for the tests.
-        owner, proj = project.split('/')
-        return self.repository(owner, proj)
-
-    def addProject(self, project):
-        owner, proj = project.name.split('/')
-        self._repos[(owner, proj)] = self.FakeRepository()
-
-
 class FakeGithubPullRequest(object):
 
     def __init__(self, github, number, project, branch,
@@ -720,6 +641,7 @@
         self.is_merged = False
         self.merge_message = None
         self.state = 'open'
+        self.url = 'https://%s/%s/pull/%s' % (github.server, project, number)
         self._createPRRef()
         self._addCommitToRepo(files=files)
         self._updateTimeStamp()
@@ -1018,18 +940,18 @@
     log = logging.getLogger("zuul.test.FakeGithubConnection")
 
     def __init__(self, driver, connection_name, connection_config,
-                 upstream_root=None):
+                 changes_db=None, upstream_root=None):
         super(FakeGithubConnection, self).__init__(driver, connection_name,
                                                    connection_config)
         self.connection_name = connection_name
         self.pr_number = 0
-        self.pull_requests = []
+        self.pull_requests = changes_db
         self.statuses = {}
         self.upstream_root = upstream_root
         self.merge_failure = False
         self.merge_not_allowed_count = 0
         self.reports = []
-        self.github_client = FakeGithub()
+        self.github_client = tests.fakegithub.FakeGithub(changes_db)
 
     def getGithubClient(self,
                         project=None,
@@ -1042,7 +964,7 @@
         pull_request = FakeGithubPullRequest(
             self, self.pr_number, project, branch, subject, self.upstream_root,
             files=files, body=body)
-        self.pull_requests.append(pull_request)
+        self.pull_requests[self.pr_number] = pull_request
         return pull_request
 
     def getPushEvent(self, project, ref, old_rev=None, new_rev=None,
@@ -1089,35 +1011,8 @@
         super(FakeGithubConnection, self).addProject(project)
         self.getGithubClient(project).addProject(project)
 
-    def getPull(self, project, number):
-        pr = self.pull_requests[number - 1]
-        data = {
-            'number': number,
-            'title': pr.subject,
-            'updated_at': pr.updated_at,
-            'base': {
-                'repo': {
-                    'full_name': pr.project
-                },
-                'ref': pr.branch,
-            },
-            'mergeable': True,
-            'state': pr.state,
-            'head': {
-                'sha': pr.head_sha,
-                'repo': {
-                    'full_name': pr.project
-                }
-            },
-            'files': pr.files,
-            'labels': pr.labels,
-            'merged': pr.is_merged,
-            'body': pr.body
-        }
-        return data
-
     def getPullBySha(self, sha, project):
-        prs = list(set([p for p in self.pull_requests if
+        prs = list(set([p for p in self.pull_requests.values() if
                         sha == p.head_sha and project == p.project]))
         if len(prs) > 1:
             raise Exception('Multiple pulls found with head sha: %s' % sha)
@@ -1125,12 +1020,12 @@
         return self.getPull(pr.project, pr.number)
 
     def _getPullReviews(self, owner, project, number):
-        pr = self.pull_requests[number - 1]
+        pr = self.pull_requests[number]
         return pr.reviews
 
     def getRepoPermission(self, project, login):
         owner, proj = project.split('/')
-        for pr in self.pull_requests:
+        for pr in self.pull_requests.values():
             pr_owner, pr_project = pr.project.split('/')
             if (pr_owner == owner and proj == pr_project):
                 if login in pr.writers:
@@ -1147,13 +1042,13 @@
     def commentPull(self, project, pr_number, message):
         # record that this got reported
         self.reports.append((project, pr_number, 'comment'))
-        pull_request = self.pull_requests[pr_number - 1]
+        pull_request = self.pull_requests[pr_number]
         pull_request.addComment(message)
 
     def mergePull(self, project, pr_number, commit_message='', sha=None):
         # record that this got reported
         self.reports.append((project, pr_number, 'merge'))
-        pull_request = self.pull_requests[pr_number - 1]
+        pull_request = self.pull_requests[pr_number]
         if self.merge_failure:
             raise Exception('Pull request was not merged')
         if self.merge_not_allowed_count > 0:
@@ -1173,32 +1068,15 @@
     def labelPull(self, project, pr_number, label):
         # record that this got reported
         self.reports.append((project, pr_number, 'label', label))
-        pull_request = self.pull_requests[pr_number - 1]
+        pull_request = self.pull_requests[pr_number]
         pull_request.addLabel(label)
 
     def unlabelPull(self, project, pr_number, label):
         # record that this got reported
         self.reports.append((project, pr_number, 'unlabel', label))
-        pull_request = self.pull_requests[pr_number - 1]
+        pull_request = self.pull_requests[pr_number]
         pull_request.removeLabel(label)
 
-    def _getNeededByFromPR(self, change):
-        prs = []
-        pattern = re.compile(r"Depends-On.*https://%s/%s/pull/%s" %
-                             (self.server, change.project.name,
-                              change.number))
-        for pr in self.pull_requests:
-            if not pr.body:
-                body = ''
-            else:
-                body = pr.body
-            if pattern.search(body):
-                # Get our version of a pull so that it's a dict
-                pull = self.getPull(pr.project, pr.number)
-                prs.append(pull)
-
-        return prs
-
 
 class BuildHistory(object):
     def __init__(self, **kw):
@@ -1432,7 +1310,8 @@
         self.log.debug("hostlist")
         hosts = super(RecordingAnsibleJob, self).getHostList(args)
         for host in hosts:
-            host['host_vars']['ansible_connection'] = 'local'
+            if not host['host_vars'].get('ansible_connection'):
+                host['host_vars']['ansible_connection'] = 'local'
 
         hosts.append(dict(
             name=['localhost'],
@@ -1738,6 +1617,11 @@
                     executor='fake-nodepool')
         if 'fakeuser' in node_type:
             data['username'] = 'fakeuser'
+        if 'windows' in node_type:
+            data['connection_type'] = 'winrm'
+        if 'network' in node_type:
+            data['connection_type'] = 'network_cli'
+
         data = json.dumps(data).encode('utf8')
         path = self.client.create(path, data,
                                   makepath=True,
@@ -2162,6 +2046,7 @@
         # Set a changes database so multiple FakeGerrit's can report back to
         # a virtual canonical database given by the configured hostname
         self.gerrit_changes_dbs = {}
+        self.github_changes_dbs = {}
 
         def getGerritConnection(driver, name, config):
             db = self.gerrit_changes_dbs.setdefault(config['server'], {})
@@ -2177,7 +2062,10 @@
             getGerritConnection))
 
         def getGithubConnection(driver, name, config):
+            server = config.get('server', 'github.com')
+            db = self.github_changes_dbs.setdefault(server, {})
             con = FakeGithubConnection(driver, name, config,
+                                       changes_db=db,
                                        upstream_root=self.upstream_root)
             self.event_queues.append(con.event_queue)
             setattr(self, 'fake_' + name, con)
@@ -2833,6 +2721,16 @@
                         os.path.join(FIXTURE_DIR, f.name))
         self.setupAllProjectKeys()
 
+    def addTagToRepo(self, project, name, sha):
+        path = os.path.join(self.upstream_root, project)
+        repo = git.Repo(path)
+        repo.git.tag(name, sha)
+
+    def delTagFromRepo(self, project, name):
+        path = os.path.join(self.upstream_root, project)
+        repo = git.Repo(path)
+        repo.git.tag('-d', name)
+
     def addCommitToRepo(self, project, message, files,
                         branch='master', tag=None):
         path = os.path.join(self.upstream_root, project)
diff --git a/tests/fakegithub.py b/tests/fakegithub.py
new file mode 100644
index 0000000..6fb2d66
--- /dev/null
+++ b/tests/fakegithub.py
@@ -0,0 +1,214 @@
+#!/usr/bin/env python
+
+# Copyright 2018 Red Hat, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import re
+
+
+class FakeUser(object):
+    def __init__(self, login):
+        self.login = login
+        self.name = "Github User"
+        self.email = "github.user@example.com"
+
+
+class FakeBranch(object):
+    def __init__(self, branch='master'):
+        self.name = branch
+
+
+class FakeStatus(object):
+    def __init__(self, state, url, description, context, user):
+        self._state = state
+        self._url = url
+        self._description = description
+        self._context = context
+        self._user = user
+
+    def as_dict(self):
+        return {
+            'state': self._state,
+            'url': self._url,
+            'description': self._description,
+            'context': self._context,
+            'creator': {
+                'login': self._user
+            }
+        }
+
+
+class FakeCommit(object):
+    def __init__(self):
+        self._statuses = []
+
+    def set_status(self, state, url, description, context, user):
+        status = FakeStatus(
+            state, url, description, context, user)
+        # always insert a status to the front of the list, to represent
+        # the last status provided for a commit.
+        self._statuses.insert(0, status)
+
+    def statuses(self):
+        return self._statuses
+
+
+class FakeRepository(object):
+    def __init__(self):
+        self._branches = [FakeBranch()]
+        self._commits = {}
+
+    def branches(self, protected=False):
+        if protected:
+            # simulate there is no protected branch
+            return []
+        return self._branches
+
+    def create_status(self, sha, state, url, description, context,
+                      user='zuul'):
+        # Since we're bypassing github API, which would require a user, we
+        # default the user as 'zuul' here.
+        commit = self._commits.get(sha, None)
+        if commit is None:
+            commit = FakeCommit()
+            self._commits[sha] = commit
+        commit.set_status(state, url, description, context, user)
+
+    def commit(self, sha):
+        commit = self._commits.get(sha, None)
+        if commit is None:
+            commit = FakeCommit()
+            self._commits[sha] = commit
+        return commit
+
+
+class FakeLabel(object):
+    def __init__(self, name):
+        self.name = name
+
+
+class FakeIssue(object):
+    def __init__(self, fake_pull_request):
+        self._fake_pull_request = fake_pull_request
+
+    def pull_request(self):
+        return FakePull(self._fake_pull_request)
+
+    def labels(self):
+        return [FakeLabel(l)
+                for l in self._fake_pull_request.labels]
+
+
+class FakeFile(object):
+    def __init__(self, filename):
+        self.filename = filename
+
+
+class FakePull(object):
+    def __init__(self, fake_pull_request):
+        self._fake_pull_request = fake_pull_request
+
+    def issue(self):
+        return FakeIssue(self._fake_pull_request)
+
+    def files(self):
+        return [FakeFile(fn)
+                for fn in self._fake_pull_request.files]
+
+    def as_dict(self):
+        pr = self._fake_pull_request
+        connection = pr.github
+        data = {
+            'number': pr.number,
+            'title': pr.subject,
+            'url': 'https://%s/%s/pull/%s' % (
+                connection.server, pr.project, pr.number
+            ),
+            'updated_at': pr.updated_at,
+            'base': {
+                'repo': {
+                    'full_name': pr.project
+                },
+                'ref': pr.branch,
+            },
+            'mergeable': True,
+            'state': pr.state,
+            'head': {
+                'sha': pr.head_sha,
+                'repo': {
+                    'full_name': pr.project
+                }
+            },
+            'merged': pr.is_merged,
+            'body': pr.body
+        }
+        return data
+
+
+class FakeIssueSearchResult(object):
+    def __init__(self, issue):
+        self.issue = issue
+
+
+class FakeGithub(object):
+    def __init__(self, pull_requests):
+        self._pull_requests = pull_requests
+        self._repos = {}
+
+    def user(self, login):
+        return FakeUser(login)
+
+    def repository(self, owner, proj):
+        return self._repos.get((owner, proj), None)
+
+    def repo_from_project(self, project):
+        # This is a convenience method for the tests.
+        owner, proj = project.split('/')
+        return self.repository(owner, proj)
+
+    def addProject(self, project):
+        owner, proj = project.name.split('/')
+        self._repos[(owner, proj)] = FakeRepository()
+
+    def pull_request(self, owner, project, number):
+        fake_pr = self._pull_requests[number]
+        return FakePull(fake_pr)
+
+    def search_issues(self, query):
+        def tokenize(s):
+            return re.findall(r'[\w]+', s)
+
+        parts = tokenize(query)
+        terms = set()
+        results = []
+        for part in parts:
+            kv = part.split(':', 1)
+            if len(kv) == 2:
+                if kv[0] in ('type', 'is', 'in'):
+                    # We only perform one search now and these aren't
+                    # important; we can honor these terms later if
+                    # necessary.
+                    continue
+            terms.add(part)
+
+        for pr in self._pull_requests.values():
+            if not pr.body:
+                body = set()
+            else:
+                body = set(tokenize(pr.body))
+            if terms.intersection(body):
+                issue = FakeIssue(pr)
+                results.append(FakeIssueSearchResult(issue))
+
+        return results
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-merge.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-merge.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-merge.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-test1.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-test1.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-test1.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-test2.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-test2.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/nonvoting-project-test2.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/project-merge.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-merge.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-merge.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/project-post.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-post.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-post.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/project-test1.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-test1.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-test1.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/project-test2.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-test2.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-test2.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/project-testfile.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-testfile.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/project-testfile.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/playbooks/project1-project2-integration.yaml b/tests/fixtures/config/cross-source/git/common-config/playbooks/project1-project2-integration.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/playbooks/project1-project2-integration.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/cross-source/git/common-config/zuul.yaml b/tests/fixtures/config/cross-source/git/common-config/zuul.yaml
new file mode 100644
index 0000000..abdc34a
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/common-config/zuul.yaml
@@ -0,0 +1,168 @@
+- pipeline:
+    name: check
+    manager: independent
+    trigger:
+      gerrit:
+        - event: patchset-created
+      github:
+        - event: pull_request
+          action: edited
+    success:
+      gerrit:
+        Verified: 1
+      github: {}
+    failure:
+      gerrit:
+        Verified: -1
+      github: {}
+
+- pipeline:
+    name: gate
+    manager: dependent
+    success-message: Build succeeded (gate).
+    require:
+      github:
+        label: approved
+      gerrit:
+        approval:
+          - Approved: 1
+    trigger:
+      gerrit:
+        - event: comment-added
+          approval:
+            - Approved: 1
+      github:
+        - event: pull_request
+          action: edited
+        - event: pull_request
+          action: labeled
+          label: approved
+    success:
+      gerrit:
+        Verified: 2
+        submit: true
+      github:
+        merge: true
+    failure:
+      gerrit:
+        Verified: -2
+      github: {}
+    start:
+      gerrit:
+        Verified: 0
+      github: {}
+    precedence: high
+
+- pipeline:
+    name: post
+    manager: independent
+    trigger:
+      gerrit:
+        - event: ref-updated
+          ref: ^(?!refs/).*$
+    precedence: low
+
+- job:
+    name: base
+    parent: null
+
+- job:
+    name: project-merge
+    hold-following-changes: true
+    nodeset:
+      nodes:
+        - name: controller
+          label: label1
+    run: playbooks/project-merge.yaml
+
+- job:
+    name: project-test1
+    attempts: 4
+    nodeset:
+      nodes:
+        - name: controller
+          label: label1
+    run: playbooks/project-test1.yaml
+
+- job:
+    name: project-test1
+    branches: stable
+    nodeset:
+      nodes:
+        - name: controller
+          label: label2
+    run: playbooks/project-test1.yaml
+
+- job:
+    name: project-post
+    nodeset:
+      nodes:
+        - name: static
+          label: ubuntu-xenial
+    run: playbooks/project-post.yaml
+
+- job:
+    name: project-test2
+    nodeset:
+      nodes:
+        - name: controller
+          label: label1
+    run: playbooks/project-test2.yaml
+
+- job:
+    name: project1-project2-integration
+    nodeset:
+      nodes:
+        - name: controller
+          label: label1
+    run: playbooks/project1-project2-integration.yaml
+
+- job:
+    name: project-testfile
+    files:
+      - .*-requires
+    run: playbooks/project-testfile.yaml
+
+- project:
+    name: gerrit/project1
+    check:
+      jobs:
+        - project-merge
+        - project-test1:
+            dependencies: project-merge
+        - project-test2:
+            dependencies: project-merge
+        - project1-project2-integration:
+            dependencies: project-merge
+    gate:
+      queue: integrated
+      jobs:
+        - project-merge
+        - project-test1:
+            dependencies: project-merge
+        - project-test2:
+            dependencies: project-merge
+        - project1-project2-integration:
+            dependencies: project-merge
+
+- project:
+    name: github/project2
+    check:
+      jobs:
+        - project-merge
+        - project-test1:
+            dependencies: project-merge
+        - project-test2:
+            dependencies: project-merge
+        - project1-project2-integration:
+            dependencies: project-merge
+    gate:
+      queue: integrated
+      jobs:
+        - project-merge
+        - project-test1:
+            dependencies: project-merge
+        - project-test2:
+            dependencies: project-merge
+        - project1-project2-integration:
+            dependencies: project-merge
diff --git a/tests/fixtures/config/cross-source/git/gerrit_project1/README b/tests/fixtures/config/cross-source/git/gerrit_project1/README
new file mode 100644
index 0000000..9daeafb
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/gerrit_project1/README
@@ -0,0 +1 @@
+test
diff --git a/tests/fixtures/config/cross-source/git/github_project2/README b/tests/fixtures/config/cross-source/git/github_project2/README
new file mode 100644
index 0000000..9daeafb
--- /dev/null
+++ b/tests/fixtures/config/cross-source/git/github_project2/README
@@ -0,0 +1 @@
+test
diff --git a/tests/fixtures/config/cross-source/main.yaml b/tests/fixtures/config/cross-source/main.yaml
new file mode 100644
index 0000000..bf85c33
--- /dev/null
+++ b/tests/fixtures/config/cross-source/main.yaml
@@ -0,0 +1,11 @@
+- tenant:
+    name: tenant-one
+    source:
+      gerrit:
+        config-projects:
+          - common-config
+        untrusted-projects:
+          - gerrit/project1
+      github:
+        untrusted-projects:
+          - github/project2
diff --git a/tests/fixtures/config/git-driver/git/common-config/playbooks/project-test2.yaml b/tests/fixtures/config/git-driver/git/common-config/playbooks/project-test2.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/git-driver/git/common-config/playbooks/project-test2.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/git-driver/git/common-config/zuul.yaml b/tests/fixtures/config/git-driver/git/common-config/zuul.yaml
index 784b5f2..53fc210 100644
--- a/tests/fixtures/config/git-driver/git/common-config/zuul.yaml
+++ b/tests/fixtures/config/git-driver/git/common-config/zuul.yaml
@@ -19,6 +19,10 @@
     name: project-test1
     run: playbooks/project-test1.yaml
 
+- job:
+    name: project-test2
+    run: playbooks/project-test2.yaml
+
 - project:
     name: org/project
     check:
diff --git a/tests/fixtures/config/implicit-project/git/common-config/playbooks/test-common.yaml b/tests/fixtures/config/implicit-project/git/common-config/playbooks/test-common.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/implicit-project/git/common-config/playbooks/test-common.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/implicit-project/git/common-config/zuul.yaml b/tests/fixtures/config/implicit-project/git/common-config/zuul.yaml
new file mode 100644
index 0000000..038c412
--- /dev/null
+++ b/tests/fixtures/config/implicit-project/git/common-config/zuul.yaml
@@ -0,0 +1,57 @@
+- pipeline:
+    name: check
+    manager: independent
+    post-review: true
+    trigger:
+      gerrit:
+        - event: patchset-created
+    success:
+      gerrit:
+        Verified: 1
+    failure:
+      gerrit:
+        Verified: -1
+
+- pipeline:
+    name: gate
+    manager: dependent
+    success-message: Build succeeded (gate).
+    trigger:
+      gerrit:
+        - event: comment-added
+          approval:
+            - Approved: 1
+    success:
+      gerrit:
+        Verified: 2
+        submit: true
+    failure:
+      gerrit:
+        Verified: -2
+    start:
+      gerrit:
+        Verified: 0
+    precedence: high
+
+
+- job:
+    name: base
+    parent: null
+
+- job:
+    name: test-common
+    run: playbooks/test-common.yaml
+
+- project:
+    check:
+      jobs:
+        - test-common
+
+- project:
+    name: org/project
+    check:
+      jobs:
+        - test-common
+    gate:
+      jobs:
+        - test-common
diff --git a/tests/fixtures/config/implicit-project/git/org_project/.zuul.yaml b/tests/fixtures/config/implicit-project/git/org_project/.zuul.yaml
new file mode 100644
index 0000000..bce195c
--- /dev/null
+++ b/tests/fixtures/config/implicit-project/git/org_project/.zuul.yaml
@@ -0,0 +1,11 @@
+- job:
+    name: test-project
+    run: playbooks/test-project.yaml
+
+- project:
+    check:
+      jobs:
+        - test-project
+    gate:
+      jobs:
+        - test-project
diff --git a/tests/fixtures/config/implicit-project/git/org_project/playbooks/test-project.yaml b/tests/fixtures/config/implicit-project/git/org_project/playbooks/test-project.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/implicit-project/git/org_project/playbooks/test-project.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/implicit-project/main.yaml b/tests/fixtures/config/implicit-project/main.yaml
new file mode 100644
index 0000000..208e274
--- /dev/null
+++ b/tests/fixtures/config/implicit-project/main.yaml
@@ -0,0 +1,8 @@
+- tenant:
+    name: tenant-one
+    source:
+      gerrit:
+        config-projects:
+          - common-config
+        untrusted-projects:
+          - org/project
diff --git a/tests/fixtures/config/inventory/git/common-config/zuul.yaml b/tests/fixtures/config/inventory/git/common-config/zuul.yaml
index ad530a7..f592eb4 100644
--- a/tests/fixtures/config/inventory/git/common-config/zuul.yaml
+++ b/tests/fixtures/config/inventory/git/common-config/zuul.yaml
@@ -38,6 +38,10 @@
         label: default-label
       - name: fakeuser
         label: fakeuser-label
+      - name: windows
+        label: windows-label
+      - name: network
+        label: network-label
 
 - job:
     name: base
diff --git a/tests/fixtures/config/protected/git/common-config/zuul.yaml b/tests/fixtures/config/protected/git/common-config/zuul.yaml
new file mode 100644
index 0000000..c941573
--- /dev/null
+++ b/tests/fixtures/config/protected/git/common-config/zuul.yaml
@@ -0,0 +1,16 @@
+- pipeline:
+    name: check
+    manager: independent
+    trigger:
+      gerrit:
+        - event: patchset-created
+    success:
+      gerrit:
+        Verified: 1
+    failure:
+      gerrit:
+        Verified: -1
+
+- job:
+    name: base
+    parent: null
diff --git a/tests/fixtures/config/protected/git/org_project/playbooks/job-protected.yaml b/tests/fixtures/config/protected/git/org_project/playbooks/job-protected.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/protected/git/org_project/playbooks/job-protected.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/protected/git/org_project/zuul.yaml b/tests/fixtures/config/protected/git/org_project/zuul.yaml
new file mode 100644
index 0000000..95f33df
--- /dev/null
+++ b/tests/fixtures/config/protected/git/org_project/zuul.yaml
@@ -0,0 +1,9 @@
+- job:
+    name: job-protected
+    protected: true
+    run: playbooks/job-protected.yaml
+
+- project:
+    name: org/project
+    check:
+      jobs: []
diff --git a/tests/fixtures/config/protected/git/org_project1/README b/tests/fixtures/config/protected/git/org_project1/README
new file mode 100644
index 0000000..9daeafb
--- /dev/null
+++ b/tests/fixtures/config/protected/git/org_project1/README
@@ -0,0 +1 @@
+test
diff --git a/tests/fixtures/config/protected/git/org_project1/playbooks/job-child-notok.yaml b/tests/fixtures/config/protected/git/org_project1/playbooks/job-child-notok.yaml
new file mode 100644
index 0000000..f679dce
--- /dev/null
+++ b/tests/fixtures/config/protected/git/org_project1/playbooks/job-child-notok.yaml
@@ -0,0 +1,2 @@
+- hosts: all
+  tasks: []
diff --git a/tests/fixtures/config/protected/git/org_project1/playbooks/placeholder b/tests/fixtures/config/protected/git/org_project1/playbooks/placeholder
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/tests/fixtures/config/protected/git/org_project1/playbooks/placeholder
diff --git a/tests/fixtures/config/protected/main.yaml b/tests/fixtures/config/protected/main.yaml
new file mode 100644
index 0000000..5f57245
--- /dev/null
+++ b/tests/fixtures/config/protected/main.yaml
@@ -0,0 +1,9 @@
+- tenant:
+    name: tenant-one
+    source:
+      gerrit:
+        config-projects:
+          - common-config
+        untrusted-projects:
+          - org/project
+          - org/project1
diff --git a/tests/fixtures/layouts/basic-git.yaml b/tests/fixtures/layouts/basic-git.yaml
new file mode 100644
index 0000000..068d0a0
--- /dev/null
+++ b/tests/fixtures/layouts/basic-git.yaml
@@ -0,0 +1,37 @@
+- pipeline:
+    name: post
+    manager: independent
+    trigger:
+      git:
+        - event: ref-updated
+          ref: ^refs/heads/.*$
+
+- pipeline:
+    name: tag
+    manager: independent
+    trigger:
+      git:
+        - event: ref-updated
+          ref: ^refs/tags/.*$
+
+- job:
+    name: base
+    parent: null
+    run: playbooks/base.yaml
+
+- job:
+    name: post-job
+    run: playbooks/post-job.yaml
+
+- job:
+    name: tag-job
+    run: playbooks/post-job.yaml
+
+- project:
+    name: org/project
+    post:
+      jobs:
+        - post-job
+    tag:
+      jobs:
+        - tag-job
diff --git a/tests/fixtures/zuul-gerrit-github.conf b/tests/fixtures/zuul-gerrit-github.conf
new file mode 100644
index 0000000..d3cbf7b
--- /dev/null
+++ b/tests/fixtures/zuul-gerrit-github.conf
@@ -0,0 +1,35 @@
+[gearman]
+server=127.0.0.1
+
+[statsd]
+# note, use 127.0.0.1 rather than localhost to avoid getting ipv6
+# see: https://github.com/jsocol/pystatsd/issues/61
+server=127.0.0.1
+
+[scheduler]
+tenant_config=main.yaml
+
+[merger]
+git_dir=/tmp/zuul-test/merger-git
+git_user_email=zuul@example.com
+git_user_name=zuul
+
+[executor]
+git_dir=/tmp/zuul-test/executor-git
+
+[connection gerrit]
+driver=gerrit
+server=review.example.com
+user=jenkins
+sshkey=fake_id_rsa_path
+
+[connection github]
+driver=github
+webhook_token=0000000000000000000000000000000000000000
+
+[connection smtp]
+driver=smtp
+server=localhost
+port=25
+default_from=zuul@example.com
+default_to=you@example.com
diff --git a/tests/fixtures/zuul-git-driver.conf b/tests/fixtures/zuul-git-driver.conf
index b24b0a1..23a2a62 100644
--- a/tests/fixtures/zuul-git-driver.conf
+++ b/tests/fixtures/zuul-git-driver.conf
@@ -21,6 +21,7 @@
 [connection git]
 driver=git
 baseurl=""
+poll_delay=0.1
 
 [connection outgoing_smtp]
 driver=smtp
diff --git a/tests/unit/test_connection.py b/tests/unit/test_connection.py
index 054ee5f..c45da94 100644
--- a/tests/unit/test_connection.py
+++ b/tests/unit/test_connection.py
@@ -115,11 +115,11 @@
         self.assertEqual('check', buildset0['pipeline'])
         self.assertEqual('org/project', buildset0['project'])
         self.assertEqual(1, buildset0['change'])
-        self.assertEqual(1, buildset0['patchset'])
+        self.assertEqual('1', buildset0['patchset'])
         self.assertEqual('SUCCESS', buildset0['result'])
         self.assertEqual('Build succeeded.', buildset0['message'])
         self.assertEqual('tenant-one', buildset0['tenant'])
-        self.assertEqual('https://hostname/%d' % buildset0['change'],
+        self.assertEqual('https://review.example.com/%d' % buildset0['change'],
                          buildset0['ref_url'])
 
         buildset0_builds = conn.execute(
@@ -141,7 +141,7 @@
         self.assertEqual('check', buildset1['pipeline'])
         self.assertEqual('org/project', buildset1['project'])
         self.assertEqual(2, buildset1['change'])
-        self.assertEqual(1, buildset1['patchset'])
+        self.assertEqual('1', buildset1['patchset'])
         self.assertEqual('FAILURE', buildset1['result'])
         self.assertEqual('Build failed.', buildset1['message'])
 
@@ -194,7 +194,7 @@
         self.assertEqual('check', buildsets_resultsdb[0]['pipeline'])
         self.assertEqual('org/project', buildsets_resultsdb[0]['project'])
         self.assertEqual(1, buildsets_resultsdb[0]['change'])
-        self.assertEqual(1, buildsets_resultsdb[0]['patchset'])
+        self.assertEqual('1', buildsets_resultsdb[0]['patchset'])
         self.assertEqual('SUCCESS', buildsets_resultsdb[0]['result'])
         self.assertEqual('Build succeeded.', buildsets_resultsdb[0]['message'])
 
@@ -215,7 +215,7 @@
         self.assertEqual(
             'org/project', buildsets_resultsdb_failures[0]['project'])
         self.assertEqual(2, buildsets_resultsdb_failures[0]['change'])
-        self.assertEqual(1, buildsets_resultsdb_failures[0]['patchset'])
+        self.assertEqual('1', buildsets_resultsdb_failures[0]['patchset'])
         self.assertEqual('FAILURE', buildsets_resultsdb_failures[0]['result'])
         self.assertEqual(
             'Build failed.', buildsets_resultsdb_failures[0]['message'])
diff --git a/tests/unit/test_cross_crd.py b/tests/unit/test_cross_crd.py
new file mode 100644
index 0000000..7d68989
--- /dev/null
+++ b/tests/unit/test_cross_crd.py
@@ -0,0 +1,950 @@
+#!/usr/bin/env python
+
+# Copyright 2012 Hewlett-Packard Development Company, L.P.
+# Copyright 2018 Red Hat, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+from tests.base import (
+    ZuulTestCase,
+)
+
+
+class TestGerritToGithubCRD(ZuulTestCase):
+    config_file = 'zuul-gerrit-github.conf'
+    tenant_config_file = 'config/cross-source/main.yaml'
+
+    def test_crd_gate(self):
+        "Test cross-repo dependencies"
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        B = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'B')
+
+        A.addApproval('Code-Review', 2)
+
+        AM2 = self.fake_gerrit.addFakeChange('gerrit/project1', 'master',
+                                             'AM2')
+        AM1 = self.fake_gerrit.addFakeChange('gerrit/project1', 'master',
+                                             'AM1')
+        AM2.setMerged()
+        AM1.setMerged()
+
+        # A -> AM1 -> AM2
+        # A Depends-On: B
+        # M2 is here to make sure it is never queried.  If it is, it
+        # means zuul is walking down the entire history of merged
+        # changes.
+
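+        # setDependsOn creates a git-level (parent commit) dependency,
+        # as opposed to the Depends-On commit message footer used below.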
+        A.setDependsOn(AM1, 1)
+        AM1.setDependsOn(AM2, 1)
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertFalse(B.is_merged)
+
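+        # Clear the connections' change caches so the approval below
+        # forces the changes to be re-queried.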
+        for connection in self.connections.connections.values():
+            connection.maintainCache([])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addLabel('approved')
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
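+        # Two items are enqueued (B ahead of A), so release each
+        # item's merge job in turn before letting the rest run.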
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(AM2.queried, 0)
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertTrue(B.is_merged)
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(len(B.comments), 2)
+
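+        # The job's 'changes' string lists each item included in the
+        # build, dependencies first: 'number,patchset' for Gerrit
+        # changes and 'PR number,head sha' for GitHub pull requests.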
+        changes = self.getJobFromHistory(
+            'project-merge', 'gerrit/project1').changes
+        self.assertEqual(changes, '1,%s 1,1' % B.head_sha)
+
+    def test_crd_branch(self):
+        "Test cross-repo dependencies in multiple branches"
+
+        self.create_branch('github/project2', 'mp')
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        B = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'B')
+        C1 = self.fake_github.openFakePullRequest('github/project2', 'mp',
+                                                  'C1')
+
+        A.addApproval('Code-Review', 2)
+
+        # A Depends-On: B+C1
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\nDepends-On: %s\n' % (
+            A.subject, B.url, C1.url)
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addLabel('approved')
+        C1.addLabel('approved')
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertTrue(B.is_merged)
+        self.assertTrue(C1.is_merged)
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(len(B.comments), 2)
+        self.assertEqual(len(C1.comments), 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'gerrit/project1').changes
+        self.assertEqual(changes, '1,%s 2,%s 1,1' %
+                         (B.head_sha, C1.head_sha))
+
+    def test_crd_gate_reverse(self):
+        "Test reverse cross-repo dependencies"
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        B = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'B')
+        A.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertFalse(B.is_merged)
+
+        self.executor_server.hold_jobs_in_build = True
+        A.addApproval('Approved', 1)
+        self.fake_github.emitEvent(B.addLabel('approved'))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertTrue(B.is_merged)
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(len(B.comments), 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'gerrit/project1').changes
+        self.assertEqual(changes, '1,%s 1,1' %
+                         (B.head_sha,))
+
+    def test_crd_cycle(self):
+        "Test cross-repo dependency cycles"
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        msg = "Depends-On: %s" % (A.data['url'],)
+        B = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'B', body=msg)
+        A.addApproval('Code-Review', 2)
+        B.addLabel('approved')
+
+        # A -> B -> A (via commit-depends)
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(len(B.comments), 0)
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertFalse(B.is_merged)
+
+    def test_crd_gate_unknown(self):
+        "Test unknown projects in dependent pipeline"
+        self.init_repo("github/unknown", tag='init')
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        B = self.fake_github.openFakePullRequest('github/unknown', 'master',
+                                                 'B')
+        A.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        event = B.addLabel('approved')
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        # Unknown projects cannot share a queue with any other project
+        # since they have no jobs in common (they have no jobs at all).
+        # Changes which depend on changes to unknown projects should
+        # not be processed in a dependent pipeline.
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertFalse(B.is_merged)
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(len(B.comments), 0)
+        self.assertEqual(len(self.history), 0)
+
+        # Simulate change B being gated outside this layout.  Set the
+        # change merged before submitting the event so that when the
+        # event triggers a query to update the change, we get the
+        # information that it was merged.
+        B.setMerged('merged')
+        self.fake_github.emitEvent(event)
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 0)
+
+        # Now that B is merged, A should be able to be enqueued and
+        # merged.
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertTrue(B.is_merged)
+        self.assertEqual(len(B.comments), 0)
+
+    def test_crd_check(self):
+        "Test cross-repo dependencies in independent pipelines"
+        self.executor_server.hold_jobs_in_build = True
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        B = self.fake_github.openFakePullRequest(
+            'github/project2', 'master', 'B')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+
+        self.assertTrue(self.builds[0].hasChanges(A, B))
+
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertFalse(B.is_merged)
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(len(B.comments), 0)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'gerrit/project1').changes
+        self.assertEqual(changes, '1,%s 1,1' %
+                         (B.head_sha,))
+
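+        # Queues in independent pipelines are dynamic, so the check
+        # pipeline should have no queues left once its items report.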
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_duplicate(self):
+        "Test duplicate check in independent pipelines"
+        self.executor_server.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        B = self.fake_github.openFakePullRequest(
+            'github/project2', 'master', 'B')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        check_pipeline = tenant.layout.pipelines['check']
+
+        # Add two dependent changes...
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...make sure the live one is not duplicated...
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...but the non-live one is able to be.
+        self.fake_github.emitEvent(B.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 3)
+
+        # Release jobs in order to avoid races with change A jobs
+        # finishing before change B jobs.
+        self.orderedRelease()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertFalse(B.is_merged)
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(len(B.comments), 1)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'gerrit/project1').changes
+        self.assertEqual(changes, '1,%s 1,1' %
+                         (B.head_sha,))
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'github/project2').changes
+        self.assertEqual(changes, '1,%s' %
+                         (B.head_sha,))
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+        self.assertIn('Build succeeded', A.messages[0])
+
+    def _test_crd_check_reconfiguration(self, project1, project2):
+        "Test cross-repo dependencies re-enqueued in independent pipelines"
+
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange(project1, 'master', 'A')
+        B = self.fake_github.openFakePullRequest(project2, 'master', 'B')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.sched.reconfigure(self.config)
+
+        # Make sure the items still share a change queue, and the
+        # first one is not live.
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 1)
+        queue = tenant.layout.pipelines['check'].queues[0]
+        first_item = queue.queue[0]
+        for item in queue.queue:
+            self.assertEqual(item.queue, first_item.queue)
+        self.assertFalse(first_item.live)
+        self.assertTrue(queue.queue[1].live)
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertFalse(B.is_merged)
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(len(B.comments), 0)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'gerrit/project1').changes
+        self.assertEqual(changes, '1,%s 1,1' %
+                         (B.head_sha,))
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_reconfiguration(self):
+        self._test_crd_check_reconfiguration('gerrit/project1',
+                                             'github/project2')
+
+    def test_crd_undefined_project(self):
+        """Test that undefined projects in dependencies are handled for
+        independent pipelines"""
+        # The explicit init_repo is a hack for the fake github driver,
+        # which writes to the repository whenever a change is created.
+        self.init_repo("github/unknown", tag='init')
+        self._test_crd_check_reconfiguration('gerrit/project1',
+                                             'github/unknown')
+
+    def test_crd_check_transitive(self):
+        "Test transitive cross-repo dependencies"
+        # Specifically, if A -> B -> C, and C gets a new patchset and
+        # A gets a new patchset, ensure the test of A,2 includes B,1
+        # and C,2 (not C,1 which would indicate stale data in the
+        # cache for B).
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        C = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'C')
+        # B Depends-On: C
+        msg = "Depends-On: %s" % (C.data['url'],)
+        B = self.fake_github.openFakePullRequest(
+            'github/project2', 'master', 'B', body=msg)
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,1 1,%s 1,1' %
+                         (B.head_sha,))
+
+        self.fake_github.emitEvent(B.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,1 1,%s' %
+                         (B.head_sha,))
+
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,1')
+
+        C.addPatchset()
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,2')
+
+        A.addPatchset()
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,2 1,%s 1,2' %
+                         (B.head_sha,))
+
+    def test_crd_check_unknown(self):
+        "Test unknown projects in independent pipeline"
+        self.init_repo("github/unknown", tag='init')
+        A = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'A')
+        B = self.fake_github.openFakePullRequest(
+            'github/unknown', 'master', 'B')
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+
+        # Make sure zuul has seen an event on B.
+        self.fake_github.emitEvent(B.getPullRequestEditedEvent())
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertFalse(B.is_merged)
+        self.assertEqual(len(B.comments), 0)
+
+    def test_crd_cycle_join(self):
+        "Test an updated change creates a cycle"
+        A = self.fake_github.openFakePullRequest(
+            'github/project2', 'master', 'A')
+
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(len(A.comments), 1)
+
+        # Create B->A
+        B = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'B')
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.url)
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # Dep is there so zuul should have reported on B
+        self.assertEqual(B.reported, 1)
+
+        # Update A to add A->B (a cycle).
+        A.editBody('Depends-On: %s\n' % (B.data['url']))
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+
+        # Dependency cycle injected so zuul should not have reported again on A
+        self.assertEqual(len(A.comments), 1)
+
+        # Now if we update B to remove the depends-on, everything
+        # should be okay.  B; A->B
+
+        B.addPatchset()
+        B.data['commitMessage'] = '%s\n' % (B.subject,)
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+
+        # Cycle was removed so now zuul should have reported again on A
+        self.assertEqual(len(A.comments), 2)
+
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(B.reported, 2)
+
+
+class TestGithubToGerritCRD(ZuulTestCase):
+    config_file = 'zuul-gerrit-github.conf'
+    tenant_config_file = 'config/cross-source/main.yaml'
+
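+    # These tests mirror TestGerritToGithubCRD above, with the roles
+    # of the GitHub pull request and the Gerrit change reversed.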
+    def test_crd_gate(self):
+        "Test cross-repo dependencies"
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'B')
+
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
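+        # For GitHub changes the Depends-On footer is read from the
+        # pull request body rather than from a commit message.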
+        A.editBody('Depends-On: %s\n' % (B.data['url']))
+
+        event = A.addLabel('approved')
+        self.fake_github.emitEvent(event)
+        self.waitUntilSettled()
+
+        self.assertFalse(A.is_merged)
+        self.assertEqual(B.data['status'], 'NEW')
+
+        for connection in self.connections.connections.values():
+            connection.maintainCache([])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        self.fake_github.emitEvent(event)
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertTrue(A.is_merged)
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(len(A.comments), 2)
+        self.assertEqual(B.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'github/project2').changes
+        self.assertEqual(changes, '1,1 1,%s' % A.head_sha)
+
+    def test_crd_branch(self):
+        "Test cross-repo dependencies in multiple branches"
+
+        self.create_branch('gerrit/project1', 'mp')
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'B')
+        C1 = self.fake_gerrit.addFakeChange('gerrit/project1', 'mp', 'C1')
+
+        B.addApproval('Code-Review', 2)
+        C1.addApproval('Code-Review', 2)
+
+        # A Depends-On: B+C1
+        A.editBody('Depends-On: %s\nDepends-On: %s\n' % (
+            B.data['url'], C1.data['url']))
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        C1.addApproval('Approved', 1)
+        self.fake_github.emitEvent(A.addLabel('approved'))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+        self.assertTrue(A.is_merged)
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(C1.data['status'], 'MERGED')
+        self.assertEqual(len(A.comments), 2)
+        self.assertEqual(B.reported, 2)
+        self.assertEqual(C1.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'github/project2').changes
+        self.assertEqual(changes, '1,1 2,1 1,%s' %
+                         (A.head_sha,))
+
+    def test_crd_gate_reverse(self):
+        "Test reverse cross-repo dependencies"
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'B')
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+
+        self.fake_github.emitEvent(A.addLabel('approved'))
+        self.waitUntilSettled()
+
+        self.assertFalse(A.is_merged)
+        self.assertEqual(B.data['status'], 'NEW')
+
+        self.executor_server.hold_jobs_in_build = True
+        A.addLabel('approved')
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertTrue(A.is_merged)
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(len(A.comments), 2)
+        self.assertEqual(B.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'github/project2').changes
+        self.assertEqual(changes, '1,1 1,%s' %
+                         (A.head_sha,))
+
+    def test_crd_cycle(self):
+        "Test cross-repo dependency cycles"
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'B')
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.url)
+
+        B.addApproval('Code-Review', 2)
+        B.addApproval('Approved', 1)
+
+        # A -> B -> A (via commit-depends)
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+
+        self.fake_github.emitEvent(A.addLabel('approved'))
+        self.waitUntilSettled()
+
+        self.assertEqual(len(A.comments), 0)
+        self.assertEqual(B.reported, 0)
+        self.assertFalse(A.is_merged)
+        self.assertEqual(B.data['status'], 'NEW')
+
+    def test_crd_gate_unknown(self):
+        "Test unknown projects in dependent pipeline"
+        self.init_repo("gerrit/unknown", tag='init')
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange('gerrit/unknown', 'master', 'B')
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+
+        B.addApproval('Approved', 1)
+        event = A.addLabel('approved')
+        self.fake_github.emitEvent(event)
+        self.waitUntilSettled()
+
+        # Unknown projects cannot share a queue with any other project
+        # since they have no jobs in common (they have no jobs at all).
+        # Changes which depend on changes to unknown projects should
+        # not be processed in a dependent pipeline.
+        self.assertFalse(A.is_merged)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(len(A.comments), 0)
+        self.assertEqual(B.reported, 0)
+        self.assertEqual(len(self.history), 0)
+
+        # Simulate change B being gated outside this layout.  Set the
+        # change merged before submitting the event so that when the
+        # event triggers a gerrit query to update the change, we get
+        # the information that it was merged.
+        B.setMerged()
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 0)
+
+        # Now that B is merged, A should be able to be enqueued and
+        # merged.
+        self.fake_github.emitEvent(event)
+        self.waitUntilSettled()
+
+        self.assertTrue(A.is_merged)
+        self.assertEqual(len(A.comments), 2)
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(B.reported, 0)
+
+    def test_crd_check(self):
+        "Test cross-repo dependencies in independent pipelines"
+        self.executor_server.hold_jobs_in_build = True
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange(
+            'gerrit/project1', 'master', 'B')
+
+        # A Depends-On: B
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+
+        self.assertTrue(self.builds[0].hasChanges(A, B))
+
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertFalse(A.is_merged)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(len(A.comments), 1)
+        self.assertEqual(B.reported, 0)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'github/project2').changes
+        self.assertEqual(changes, '1,1 1,%s' %
+                         (A.head_sha,))
+
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_duplicate(self):
+        "Test duplicate check in independent pipelines"
+        self.executor_server.hold_jobs_in_build = True
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange(
+            'gerrit/project1', 'master', 'B')
+
+        # A Depends-On: B
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        check_pipeline = tenant.layout.pipelines['check']
+
+        # Add two dependent changes...
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...make sure the live one is not duplicated...
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...but the non-live one is able to be.
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 3)
+
+        # Release jobs in order to avoid races with change A jobs
+        # finishing before change B jobs.
+        self.orderedRelease()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertFalse(A.is_merged)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(len(A.comments), 1)
+        self.assertEqual(B.reported, 1)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'github/project2').changes
+        self.assertEqual(changes, '1,1 1,%s' %
+                         (A.head_sha,))
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'gerrit/project1').changes
+        self.assertEqual(changes, '1,1')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+        self.assertIn('Build succeeded', A.comments[0])
+
+    def _test_crd_check_reconfiguration(self, project1, project2):
+        "Test cross-repo dependencies re-enqueued in independent pipelines"
+
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_github.openFakePullRequest(project1, 'master', 'A')
+        B = self.fake_gerrit.addFakeChange(project2, 'master', 'B')
+
+        # A Depends-On: B
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+
+        self.sched.reconfigure(self.config)
+
+        # Make sure the items still share a change queue, and the
+        # first one is not live.
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 1)
+        queue = tenant.layout.pipelines['check'].queues[0]
+        first_item = queue.queue[0]
+        for item in queue.queue:
+            self.assertEqual(item.queue, first_item.queue)
+        self.assertFalse(first_item.live)
+        self.assertTrue(queue.queue[1].live)
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.assertFalse(A.is_merged)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(len(A.comments), 1)
+        self.assertEqual(B.reported, 0)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'github/project2').changes
+        self.assertEqual(changes, '1,1 1,%s' %
+                         (A.head_sha,))
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_reconfiguration(self):
+        self._test_crd_check_reconfiguration('github/project2',
+                                             'gerrit/project1')
+
+    def test_crd_undefined_project(self):
+        """Test that undefined projects in dependencies are handled for
+        independent pipelines"""
+        # The explicit init_repo is a hack for the fake gerrit driver,
+        # which writes to the repository whenever a change is created.
+        self.init_repo("gerrit/unknown", tag='init')
+        self._test_crd_check_reconfiguration('github/project2',
+                                             'gerrit/unknown')
+
+    def test_crd_check_transitive(self):
+        "Test transitive cross-repo dependencies"
+        # Specifically, if A -> B -> C, and C gets a new patchset and
+        # A gets a new patchset, ensure the test of A,2 includes B,1
+        # and C,2 (not C,1 which would indicate stale data in the
+        # cache for B).
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange('gerrit/project1', 'master', 'B')
+        C = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'C')
+
+        # B Depends-On: C
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, C.url)
+
+        # A Depends-On: B
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,%s 1,1 1,%s' %
+                         (C.head_sha, A.head_sha))
+
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,%s 1,1' %
+                         (C.head_sha,))
+
+        self.fake_github.emitEvent(C.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,%s' %
+                         (C.head_sha,))
+
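+        # Adding a commit to the PR changes its head sha, the GitHub
+        # analogue of a new patchset.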
+        old_c_head = C.head_sha
+        C.addCommit()
+        new_c_head = C.head_sha
+        self.assertNotEqual(old_c_head, new_c_head)
+        self.fake_github.emitEvent(C.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,%s' %
+                         (C.head_sha,))
+
+        old_a_head = A.head_sha
+        A.addCommit()
+        new_a_head = A.head_sha
+        self.assertNotEqual(old_a_head, new_a_head)
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '2,%s 1,1 1,%s' %
+                         (C.head_sha, A.head_sha,))
+
+    def test_crd_check_unknown(self):
+        "Test unknown projects in independent pipeline"
+        self.init_repo("gerrit/unknown", tag='init')
+        A = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'A')
+        B = self.fake_gerrit.addFakeChange(
+            'gerrit/unknown', 'master', 'B')
+
+        # A Depends-On: B
+        A.editBody('Depends-On: %s\n' % (B.data['url'],))
+
+        # Make sure zuul has seen an event on B.
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.fake_github.emitEvent(A.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+
+        self.assertFalse(A.is_merged)
+        self.assertEqual(len(A.comments), 1)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(B.reported, 0)
+
+    def test_crd_cycle_join(self):
+        "Test an updated change creates a cycle"
+        A = self.fake_gerrit.addFakeChange(
+            'gerrit/project1', 'master', 'A')
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(A.reported, 1)
+
+        # Create B->A
+        B = self.fake_github.openFakePullRequest('github/project2', 'master',
+                                                 'B')
+        B.editBody('Depends-On: %s\n' % (A.data['url'],))
+        self.fake_github.emitEvent(B.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+
+        # Dep is there so zuul should have reported on B
+        self.assertEqual(len(B.comments), 1)
+
+        # Update A to add A->B (a cycle).
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.url)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # Dependency cycle injected so zuul should not have reported again on A
+        self.assertEqual(A.reported, 1)
+
+        # Now if we update B to remove the depends-on, everything
+        # should be okay.  B; A->B
+
+        B.addCommit()
+        B.editBody('')
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # Cycle was removed so now zuul should have reported again on A
+        self.assertEqual(A.reported, 2)
+
+        self.fake_github.emitEvent(B.getPullRequestEditedEvent())
+        self.waitUntilSettled()
+        self.assertEqual(len(B.comments), 2)
diff --git a/tests/unit/test_gerrit_crd.py b/tests/unit/test_gerrit_crd.py
new file mode 100644
index 0000000..a8924b9
--- /dev/null
+++ b/tests/unit/test_gerrit_crd.py
@@ -0,0 +1,685 @@
+#!/usr/bin/env python
+
+# Copyright 2012 Hewlett-Packard Development Company, L.P.
+# Copyright 2018 Red Hat, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+from tests.base import (
+    ZuulTestCase,
+    simple_layout,
+)
+
+
+class TestGerritCRD(ZuulTestCase):
+    tenant_config_file = 'config/single-tenant/main.yaml'
+
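+    # Cross-repo dependency tests where both changes live in Gerrit.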
+    def test_crd_gate(self):
+        "Test cross-repo dependencies"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        AM2 = self.fake_gerrit.addFakeChange('org/project1', 'master', 'AM2')
+        AM1 = self.fake_gerrit.addFakeChange('org/project1', 'master', 'AM1')
+        AM2.setMerged()
+        AM1.setMerged()
+
+        BM2 = self.fake_gerrit.addFakeChange('org/project2', 'master', 'BM2')
+        BM1 = self.fake_gerrit.addFakeChange('org/project2', 'master', 'BM1')
+        BM2.setMerged()
+        BM1.setMerged()
+
+        # A -> AM1 -> AM2
+        # B -> BM1 -> BM2
+        # A Depends-On: B
+        # M2 is here to make sure it is never queried.  If it is, it
+        # means zuul is walking down the entire history of merged
+        # changes.
+
+        B.setDependsOn(BM1, 1)
+        BM1.setDependsOn(BM2, 1)
+
+        A.setDependsOn(AM1, 1)
+        AM1.setDependsOn(AM2, 1)
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+
+        for connection in self.connections.connections.values():
+            connection.maintainCache([])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(AM2.queried, 0)
+        self.assertEqual(BM2.queried, 0)
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 1,1')
+
+    def test_crd_gate_triangle(self):
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+        C.addApproval('Code-Review', 2)
+        A.addApproval('Approved', 1)
+        B.addApproval('Approved', 1)
+
+        # C-->B
+        #  \ /
+        #   v
+        #   A
+
+        # C Depends-On: A
+        C.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            C.subject, A.data['url'])
+        # B Depends-On: A
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.data['url'])
+        # C git-depends on B
+        C.setDependsOn(B, 1)
+        self.fake_gerrit.addEvent(C.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+        self.assertEqual(C.reported, 2)
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(C.data['status'], 'MERGED')
+        self.assertEqual(self.history[-1].changes, '1,1 2,1 3,1')
+
+    def test_crd_branch(self):
+        "Test cross-repo dependencies in multiple branches"
+
+        self.create_branch('org/project2', 'mp')
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C1 = self.fake_gerrit.addFakeChange('org/project2', 'mp', 'C1')
+
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+        C1.addApproval('Code-Review', 2)
+
+        # A Depends-On: B+C1
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\nDepends-On: %s\n' % (
+            A.subject, B.data['url'], C1.data['url'])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        C1.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(C1.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+        self.assertEqual(C1.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 3,1 1,1')
+
+    def test_crd_multiline(self):
+        "Test multiple depends-on lines in commit"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+        C.addApproval('Code-Review', 2)
+
+        # A Depends-On: B+C
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\nDepends-On: %s\n' % (
+            A.subject, B.data['url'], C.data['url'])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        C.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(C.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+        self.assertEqual(C.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 3,1 1,1')
+
+    def test_crd_unshared_gate(self):
+        "Test cross-repo dependencies in unshared gate queues"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        # A and B do not share a queue; make sure that A is unable to
+        # enqueue B (and therefore, that A cannot be enqueued either).
+        B.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(B.reported, 0)
+        self.assertEqual(len(self.history), 0)
+
+        # Enqueue and merge B alone.
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(B.reported, 2)
+
+        # Now that B is merged, A should be able to be enqueued and
+        # merged.
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+
+    def test_crd_gate_reverse(self):
+        "Test reverse cross-repo dependencies"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+
+        self.executor_server.hold_jobs_in_build = True
+        A.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 1,1')
+
+    def test_crd_cycle(self):
+        "Test cross-repo dependency cycles"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+        B.addApproval('Approved', 1)
+
+        # A -> B -> A (via commit-depends)
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.data['url'])
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(B.reported, 0)
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+
+    def test_crd_gate_unknown(self):
+        "Test unknown projects in dependent pipeline"
+        self.init_repo("org/unknown", tag='init')
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/unknown', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        B.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        # Unknown projects cannot share a queue with any other project
+        # since they have no jobs in common (they have no jobs at all).
+        # Changes which depend on changes to unknown projects should
+        # not be processed in a dependent pipeline.
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(B.reported, 0)
+        self.assertEqual(len(self.history), 0)
+
+        # Simulate change B being gated outside this layout.  Set the
+        # change merged before submitting the event so that when the
+        # event triggers a gerrit query to update the change, we get
+        # the information that it was merged.
+        B.setMerged()
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 0)
+
+        # Now that B is merged, A should be able to be enqueued and
+        # merged.
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(B.reported, 0)
+
+    def test_crd_check(self):
+        "Test cross-repo dependencies in independent pipelines"
+
+        self.executor_server.hold_jobs_in_build = True
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+
+        self.assertTrue(self.builds[0].hasChanges(A, B))
+
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 0)
+
+        self.assertEqual(self.history[0].changes, '2,1 1,1')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_git_depends(self):
+        "Test single-repo dependencies in independent pipelines"
+        self.executor_server.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+
+        # Add two git-dependent changes and make sure they both report
+        # success.
+        B.setDependsOn(A, 1)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.orderedRelease()
+        self.executor_server.hold_jobs_in_build = False
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 1)
+
+        self.assertEqual(self.history[0].changes, '1,1')
+        self.assertEqual(self.history[-1].changes, '1,1 2,1')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+        self.assertIn('Build succeeded', A.messages[0])
+        self.assertIn('Build succeeded', B.messages[0])
+
+    def test_crd_check_duplicate(self):
+        "Test duplicate check in independent pipelines"
+        self.executor_server.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        check_pipeline = tenant.layout.pipelines['check']
+
+        # Add two git-dependent changes...
+        B.setDependsOn(A, 1)
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...make sure the live one is not duplicated...
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...but the non-live one is able to be.
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 3)
+
+        # Release jobs in order to avoid races with change A jobs
+        # finishing before change B jobs.
+        self.orderedRelease()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 1)
+
+        self.assertEqual(self.history[0].changes, '1,1 2,1')
+        self.assertEqual(self.history[1].changes, '1,1')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+        self.assertIn('Build succeeded', A.messages[0])
+        self.assertIn('Build succeeded', B.messages[0])
+
+    def _test_crd_check_reconfiguration(self, project1, project2):
+        "Test cross-repo dependencies re-enqueued in independent pipelines"
+
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange(project1, 'master', 'A')
+        B = self.fake_gerrit.addFakeChange(project2, 'master', 'B')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.sched.reconfigure(self.config)
+
+        # Make sure the items still share a change queue, and the
+        # first one is not live.
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 1)
+        queue = tenant.layout.pipelines['check'].queues[0]
+        first_item = queue.queue[0]
+        for item in queue.queue:
+            self.assertEqual(item.queue, first_item.queue)
+        self.assertFalse(first_item.live)
+        self.assertTrue(queue.queue[1].live)
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 0)
+
+        self.assertEqual(self.history[0].changes, '2,1 1,1')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_reconfiguration(self):
+        self._test_crd_check_reconfiguration('org/project1', 'org/project2')
+
+    def test_crd_undefined_project(self):
+        """Test that undefined projects in dependencies are handled for
+        independent pipelines"""
+        # The explicit init_repo is a hack for the fake gerrit driver,
+        # which writes to the repository whenever a change is created.
+        self.init_repo("org/unknown", tag='init')
+        self._test_crd_check_reconfiguration('org/project1', 'org/unknown')
+
+    @simple_layout('layouts/ignore-dependencies.yaml')
+    def test_crd_check_ignore_dependencies(self):
+        "Test cross-repo dependencies can be ignored"
+
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+        # C git-depends on B
+        C.setDependsOn(B, 1)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # Make sure none of the items share a change queue, and all
+        # are live.
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        check_pipeline = tenant.layout.pipelines['check']
+        self.assertEqual(len(check_pipeline.queues), 3)
+        self.assertEqual(len(check_pipeline.getAllItems()), 3)
+        for item in check_pipeline.getAllItems():
+            self.assertTrue(item.live)
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(C.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 1)
+        self.assertEqual(C.reported, 1)
+
+        # Each job should have tested exactly one change
+        for job in self.history:
+            self.assertEqual(len(job.changes.split()), 1)
+
+    def test_crd_check_triangle(self):
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
+
+        # C-->B
+        #  \ /
+        #   v
+        #   A
+
+        # C Depends-On: A
+        C.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            C.subject, A.data['url'])
+        # B Depends-On: A
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.data['url'])
+        # C git-depends on B
+        C.setDependsOn(B, 1)
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.assertEqual(C.reported, 1)
+        self.assertEqual(self.history[0].changes, '1,1 2,1 3,1')
+
+    @simple_layout('layouts/three-projects.yaml')
+    def test_crd_check_transitive(self):
+        "Test transitive cross-repo dependencies"
+        # Specifically, if A -> B -> C, and C gets a new patchset and
+        # A gets a new patchset, ensure the test of A,2 includes B,1
+        # and C,2 (not C,1 which would indicate stale data in the
+        # cache for B).
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project3', 'master', 'C')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        # B Depends-On: C
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, C.data['url'])
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,1 2,1 1,1')
+
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,1 2,1')
+
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,1')
+
+        C.addPatchset()
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,2')
+
+        A.addPatchset()
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,2 2,1 1,2')
+
+    def test_crd_check_unknown(self):
+        "Test unknown projects in independent pipeline"
+        self.init_repo("org/unknown", tag='init')
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/unknown', 'master', 'D')
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+
+        # Make sure zuul has seen an event on B.
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(B.reported, 0)
+
+    def test_crd_cycle_join(self):
+        "Test an updated change creates a cycle"
+        A = self.fake_gerrit.addFakeChange('org/project2', 'master', 'A')
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(A.reported, 1)
+
+        # Create B->A
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.data['url'])
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # Dep is there so zuul should have reported on B
+        self.assertEqual(B.reported, 1)
+
+        # Update A to add A->B (a cycle).
+        A.addPatchset()
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['url'])
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+
+        # Dependency cycle injected so zuul should not have reported again on A
+        self.assertEqual(A.reported, 1)
+
+        # Now if we update B to remove the depends-on, everything
+        # should be okay.  B; A->B
+
+        B.addPatchset()
+        B.data['commitMessage'] = '%s\n' % (B.subject,)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+
+        # Cycle was removed so now zuul should have reported again on A
+        self.assertEqual(A.reported, 2)
+
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(B.reported, 2)
diff --git a/tests/unit/test_gerrit_legacy_crd.py b/tests/unit/test_gerrit_legacy_crd.py
new file mode 100644
index 0000000..90c93ec
--- /dev/null
+++ b/tests/unit/test_gerrit_legacy_crd.py
@@ -0,0 +1,630 @@
+#!/usr/bin/env python
+
+# Copyright 2012 Hewlett-Packard Development Company, L.P.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+from tests.base import (
+    ZuulTestCase,
+    simple_layout,
+)
+
+
+class TestGerritLegacyCRD(ZuulTestCase):
+    tenant_config_file = 'config/single-tenant/main.yaml'
+
+    def test_crd_gate(self):
+        "Test cross-repo dependencies"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        AM2 = self.fake_gerrit.addFakeChange('org/project1', 'master', 'AM2')
+        AM1 = self.fake_gerrit.addFakeChange('org/project1', 'master', 'AM1')
+        AM2.setMerged()
+        AM1.setMerged()
+
+        BM2 = self.fake_gerrit.addFakeChange('org/project2', 'master', 'BM2')
+        BM1 = self.fake_gerrit.addFakeChange('org/project2', 'master', 'BM1')
+        BM2.setMerged()
+        BM1.setMerged()
+
+        # A -> AM1 -> AM2
+        # B -> BM1 -> BM2
+        # A Depends-On: B
+        # M2 is here to make sure it is never queried.  If it is, it
+        # means zuul is walking down the entire history of merged
+        # changes.
+
+        B.setDependsOn(BM1, 1)
+        BM1.setDependsOn(BM2, 1)
+
+        A.setDependsOn(AM1, 1)
+        AM1.setDependsOn(AM2, 1)
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+
+        for connection in self.connections.connections.values():
+            connection.maintainCache([])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(AM2.queried, 0)
+        self.assertEqual(BM2.queried, 0)
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 1,1')
+
+    def test_crd_branch(self):
+        "Test cross-repo dependencies in multiple branches"
+
+        self.create_branch('org/project2', 'mp')
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C1 = self.fake_gerrit.addFakeChange('org/project2', 'mp', 'C1')
+        C2 = self.fake_gerrit.addFakeChange('org/project2', 'mp', 'C2',
+                                            status='ABANDONED')
+        C1.data['id'] = B.data['id']
+        C2.data['id'] = B.data['id']
+
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+        C1.addApproval('Code-Review', 2)
+
+        # A Depends-On: B+C1
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        C1.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(C1.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+        self.assertEqual(C1.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 3,1 1,1')
+
+    def test_crd_multiline(self):
+        "Test multiple depends-on lines in commit"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+        C.addApproval('Code-Review', 2)
+
+        # A Depends-On: B+C
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\nDepends-On: %s\n' % (
+            A.subject, B.data['id'], C.data['id'])
+
+        self.executor_server.hold_jobs_in_build = True
+        B.addApproval('Approved', 1)
+        C.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(C.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+        self.assertEqual(C.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 3,1 1,1')
+
+    def test_crd_unshared_gate(self):
+        "Test cross-repo dependencies in unshared gate queues"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        # A and B do not share a queue; make sure that A is unable to
+        # enqueue B (and therefore, A is unable to be enqueued).
+        B.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(B.reported, 0)
+        self.assertEqual(len(self.history), 0)
+
+        # Enqueue and merge B alone.
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(B.reported, 2)
+
+        # Now that B is merged, A should be able to be enqueued and
+        # merged.
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+
+    def test_crd_gate_reverse(self):
+        "Test reverse cross-repo dependencies"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+
+        self.executor_server.hold_jobs_in_build = True
+        A.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.reported, 2)
+
+        changes = self.getJobFromHistory(
+            'project-merge', 'org/project1').changes
+        self.assertEqual(changes, '2,1 1,1')
+
+    def test_crd_cycle(self):
+        "Test cross-repo dependency cycles"
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+        B.addApproval('Approved', 1)
+
+        # A -> B -> A (via commit-depends)
+
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.data['id'])
+
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(B.reported, 0)
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+
+    def test_crd_gate_unknown(self):
+        "Test unknown projects in dependent pipeline"
+        self.init_repo("org/unknown", tag='init')
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/unknown', 'master', 'B')
+        A.addApproval('Code-Review', 2)
+        B.addApproval('Code-Review', 2)
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        B.addApproval('Approved', 1)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        # Unknown projects cannot share a queue with any other
+        # project since they have no jobs in common (they have no
+        # jobs at all).  Changes which depend on changes in unknown
+        # projects should not be processed in a dependent pipeline.
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 0)
+        self.assertEqual(B.reported, 0)
+        self.assertEqual(len(self.history), 0)
+
+        # Simulate change B being gated outside this layout.  Set the
+        # change merged before submitting the event so that when the
+        # event triggers a gerrit query to update the change, we get
+        # the information that it was merged.
+        B.setMerged()
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 0)
+
+        # Now that B is merged, A should be able to be enqueued and
+        # merged.
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'MERGED')
+        self.assertEqual(A.reported, 2)
+        self.assertEqual(B.data['status'], 'MERGED')
+        self.assertEqual(B.reported, 0)
+
+    def test_crd_check(self):
+        "Test cross-repo dependencies in independent pipelines"
+
+        self.executor_server.hold_jobs_in_build = True
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.executor_server.release('.*-merge')
+        self.waitUntilSettled()
+
+        self.assertTrue(self.builds[0].hasChanges(A, B))
+
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 0)
+
+        self.assertEqual(self.history[0].changes, '2,1 1,1')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_git_depends(self):
+        "Test single-repo dependencies in independent pipelines"
+        self.gearman_server.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+
+        # Add two git-dependent changes and make sure they both report
+        # success.
+        B.setDependsOn(A, 1)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.orderedRelease()
+        self.gearman_server.hold_jobs_in_build = False
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 1)
+
+        self.assertEqual(self.history[0].changes, '1,1')
+        self.assertEqual(self.history[-1].changes, '1,1 2,1')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+        self.assertIn('Build succeeded', A.messages[0])
+        self.assertIn('Build succeeded', B.messages[0])
+
+    def test_crd_check_duplicate(self):
+        "Test duplicate check in independent pipelines"
+        self.executor_server.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        check_pipeline = tenant.layout.pipelines['check']
+
+        # Add two git-dependent changes...
+        B.setDependsOn(A, 1)
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...make sure the live one is not duplicated...
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 2)
+
+        # ...but the non-live one is able to be.
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(check_pipeline.getAllItems()), 3)
+
+        # Release jobs in order to avoid races with change A jobs
+        # finishing before change B jobs.
+        self.orderedRelease()
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 1)
+
+        self.assertEqual(self.history[0].changes, '1,1 2,1')
+        self.assertEqual(self.history[1].changes, '1,1')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+        self.assertIn('Build succeeded', A.messages[0])
+        self.assertIn('Build succeeded', B.messages[0])
+
+    def _test_crd_check_reconfiguration(self, project1, project2):
+        "Test cross-repo dependencies re-enqueued in independent pipelines"
+
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange(project1, 'master', 'A')
+        B = self.fake_gerrit.addFakeChange(project2, 'master', 'B')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.sched.reconfigure(self.config)
+
+        # Make sure the items still share a change queue, and the
+        # first one is not live.
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 1)
+        queue = tenant.layout.pipelines['check'].queues[0]
+        first_item = queue.queue[0]
+        for item in queue.queue:
+            self.assertEqual(item.queue, first_item.queue)
+        self.assertFalse(first_item.live)
+        self.assertTrue(queue.queue[1].live)
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 0)
+
+        self.assertEqual(self.history[0].changes, '2,1 1,1')
+        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
+
+    def test_crd_check_reconfiguration(self):
+        self._test_crd_check_reconfiguration('org/project1', 'org/project2')
+
+    def test_crd_undefined_project(self):
+        """Test that undefined projects in dependencies are handled for
+        independent pipelines"""
+        # This is a hack for the fake gerrit, as it implies repo
+        # creation upon the creation of any change.
+        self.init_repo("org/unknown", tag='init')
+        self._test_crd_check_reconfiguration('org/project1', 'org/unknown')
+
+    @simple_layout('layouts/ignore-dependencies.yaml')
+    def test_crd_check_ignore_dependencies(self):
+        "Test cross-repo dependencies can be ignored"
+
+        self.gearman_server.hold_jobs_in_queue = True
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+        # C git-depends on B
+        C.setDependsOn(B, 1)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # Make sure none of the items share a change queue, and all
+        # are live.
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        check_pipeline = tenant.layout.pipelines['check']
+        self.assertEqual(len(check_pipeline.queues), 3)
+        self.assertEqual(len(check_pipeline.getAllItems()), 3)
+        for item in check_pipeline.getAllItems():
+            self.assertTrue(item.live)
+
+        self.gearman_server.hold_jobs_in_queue = False
+        self.gearman_server.release()
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(C.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.reported, 1)
+        self.assertEqual(C.reported, 1)
+
+        # Each job should have tested exactly one change
+        for job in self.history:
+            self.assertEqual(len(job.changes.split()), 1)
+
+    @simple_layout('layouts/three-projects.yaml')
+    def test_crd_check_transitive(self):
+        "Test transitive cross-repo dependencies"
+        # Specifically, if A -> B -> C, and C gets a new patchset and
+        # A gets a new patchset, ensure the test of A,2 includes B,1
+        # and C,2 (not C,1 which would indicate stale data in the
+        # cache for B).
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
+        C = self.fake_gerrit.addFakeChange('org/project3', 'master', 'C')
+
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        # B Depends-On: C
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, C.data['id'])
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,1 2,1 1,1')
+
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,1 2,1')
+
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,1')
+
+        C.addPatchset()
+        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,2')
+
+        A.addPatchset()
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(self.history[-1].changes, '3,2 2,1 1,2')
+
+    def test_crd_check_unknown(self):
+        "Test unknown projects in independent pipeline"
+        self.init_repo("org/unknown", tag='init')
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
+        B = self.fake_gerrit.addFakeChange('org/unknown', 'master', 'D')
+        # A Depends-On: B
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+
+        # Make sure zuul has seen an event on B.
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(B.reported, 0)
+
+    def test_crd_cycle_join(self):
+        "Test an updated change creates a cycle"
+        A = self.fake_gerrit.addFakeChange('org/project2', 'master', 'A')
+
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(A.reported, 1)
+
+        # Create B->A
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            B.subject, A.data['id'])
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # Dep is there so zuul should have reported on B
+        self.assertEqual(B.reported, 1)
+
+        # Update A to add A->B (a cycle).
+        A.addPatchset()
+        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
+            A.subject, B.data['id'])
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+
+        # Dependency cycle injected so zuul should not have reported again on A
+        self.assertEqual(A.reported, 1)
+
+        # Now if we update B to remove the depends-on, everything
+        # should be okay.  B; A->B
+
+        B.addPatchset()
+        B.data['commitMessage'] = '%s\n' % (B.subject,)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+
+        # Cycle was removed so now zuul should have reported again on A
+        self.assertEqual(A.reported, 2)
+
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(2))
+        self.waitUntilSettled()
+        self.assertEqual(B.reported, 2)
diff --git a/tests/unit/test_git_driver.py b/tests/unit/test_git_driver.py
index 1cfadf4..e8762d0 100644
--- a/tests/unit/test_git_driver.py
+++ b/tests/unit/test_git_driver.py
@@ -12,18 +12,28 @@
 # License for the specific language governing permissions and limitations
 # under the License.
 
-from tests.base import ZuulTestCase
+
+import os
+import time
+import yaml
+
+from tests.base import ZuulTestCase, simple_layout
 
 
 class TestGitDriver(ZuulTestCase):
     config_file = 'zuul-git-driver.conf'
     tenant_config_file = 'config/git-driver/main.yaml'
 
+    def setUp(self):
+        super(TestGitDriver, self).setUp()
+        self.git_connection = self.sched.connections.getSource('git').\
+            connection
+
     def setup_config(self):
         super(TestGitDriver, self).setup_config()
         self.config.set('connection git', 'baseurl', self.upstream_root)
 
-    def test_git_driver(self):
+    def test_basic(self):
         tenant = self.sched.abide.tenants.get('tenant-one')
         # Check that we have the git source for common-config and the
         # gerrit source for the project.
@@ -40,3 +50,137 @@
         self.waitUntilSettled()
         self.assertEqual(len(self.history), 1)
         self.assertEqual(A.reported, 1)
+
+    def test_config_refreshed(self):
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 1)
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(self.history[0].name, 'project-test1')
+
+        # Update zuul.yaml to force a tenant reconfiguration
+        path = os.path.join(self.upstream_root, 'common-config', 'zuul.yaml')
+        config = yaml.load(open(path, 'r').read())
+        change = {
+            'name': 'org/project',
+            'check': {
+                'jobs': [
+                    'project-test2'
+                ]
+            }
+        }
+        config[4]['project'] = change
+        files = {'zuul.yaml': yaml.dump(config)}
+        self.addCommitToRepo(
+            'common-config', 'Change zuul.yaml configuration', files)
+
+        # Wait for the tenant reconfiguration to happen
+        count = self.waitForEvent()
+        self.waitUntilSettled()
+
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 2)
+        self.assertEqual(A.reported, 1)
+        # We make sure the new job has run
+        self.assertEqual(self.history[1].name, 'project-test2')
+
+        # Stop the git watcher so that we can merge some commits.  We
+        # want to verify that config changes are detected for commits
+        # in the range oldrev..newrev.
+        self.sched.connections.getSource('git').connection.w_pause = True
+        # Add a config change
+        change = {
+            'name': 'org/project',
+            'check': {
+                'jobs': [
+                    'project-test1'
+                ]
+            }
+        }
+        config[4]['project'] = change
+        files = {'zuul.yaml': yaml.dump(config)}
+        self.addCommitToRepo(
+            'common-config', 'Change zuul.yaml configuration', files)
+        # Add two other changes
+        self.addCommitToRepo(
+            'common-config', 'Adding f1',
+            {'f1': "Content"})
+        self.addCommitToRepo(
+            'common-config', 'Adding f2',
+            {'f2': "Content"})
+        # Restart the git watcher
+        self.sched.connections.getSource('git').connection.w_pause = False
+
+        # Wait for the tenant reconfiguration to happen
+        self.waitForEvent(count)
+        self.waitUntilSettled()
+
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 3)
+        self.assertEqual(A.reported, 1)
+        # We make sure the new job has run
+        self.assertEqual(self.history[2].name, 'project-test1')
+
+    def ensure_watcher_has_context(self):
+        # Make sure the watcher has read the initial ref SHAs
+        delay = 0.1
+        max_delay = 1
+        while not self.git_connection.projects_refs:
+            time.sleep(delay)
+            max_delay -= delay
+            if max_delay <= 0:
+                raise Exception("Timeout waiting for initial read")
+        return self.git_connection.watcher_thread._event_count
+
+    def waitForEvent(self, initial_count=0):
+        delay = 0.1
+        max_delay = 1
+        while self.git_connection.watcher_thread._event_count <= initial_count:
+            time.sleep(delay)
+            max_delay -= delay
+            if max_delay <= 0:
+                raise Exception("Timeout waiting for event")
+        return self.git_connection.watcher_thread._event_count
+
+    @simple_layout('layouts/basic-git.yaml', driver='git')
+    def test_ref_updated_event(self):
+        count = self.ensure_watcher_has_context()
+        # Add a commit to trigger a ref-updated event
+        self.addCommitToRepo(
+            'org/project', 'A change for ref-updated', {'f1': 'Content'})
+        # Wait for the git watcher to detect the ref-update event
+        self.waitForEvent(count)
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 1)
+        self.assertEqual('SUCCESS',
+                         self.getJobFromHistory('post-job').result)
+
+    @simple_layout('layouts/basic-git.yaml', driver='git')
+    def test_ref_created(self):
+        count = self.ensure_watcher_has_context()
+        # Tag HEAD to trigger a ref-updated event
+        self.addTagToRepo(
+            'org/project', 'atag', 'HEAD')
+        # Wait for the git watcher to detect the ref-update event
+        self.waitForEvent(count)
+        self.waitUntilSettled()
+        self.assertEqual(len(self.history), 1)
+        self.assertEqual('SUCCESS',
+                         self.getJobFromHistory('tag-job').result)
+
+    @simple_layout('layouts/basic-git.yaml', driver='git')
+    def test_ref_deleted(self):
+        count = self.ensure_watcher_has_context()
+        # Delete default tag init to trigger a ref-updated event
+        self.delTagFromRepo(
+            'org/project', 'init')
+        # Wait for the git watcher to detect the ref-update event
+        self.waitForEvent(count)
+        self.waitUntilSettled()
+        # Make sure no job has run, as ignore-delete is True by default
+        self.assertEqual(len(self.history), 0)
diff --git a/tests/unit/test_github_driver.py b/tests/unit/test_github_driver.py
index ebb5e1c..3942b0b 100644
--- a/tests/unit/test_github_driver.py
+++ b/tests/unit/test_github_driver.py
@@ -50,6 +50,12 @@
         self.assertEqual(str(A.head_sha), zuulvars['patchset'])
         self.assertEqual('master', zuulvars['branch'])
         self.assertEqual(1, len(A.comments))
+        self.assertThat(
+            A.comments[0],
+            MatchesRegex('.*\[project-test1 \]\(.*\).*', re.DOTALL))
+        self.assertThat(
+            A.comments[0],
+            MatchesRegex('.*\[project-test2 \]\(.*\).*', re.DOTALL))
         self.assertEqual(2, len(self.history))
 
         # test_pull_unmatched_branch_event(self):
@@ -243,19 +249,28 @@
     @simple_layout('layouts/basic-github.yaml', driver='github')
     def test_git_https_url(self):
         """Test that git_ssh option gives git url with ssh"""
-        url = self.fake_github.real_getGitUrl('org/project')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        _, project = tenant.getProject('org/project')
+
+        url = self.fake_github.real_getGitUrl(project)
         self.assertEqual('https://github.com/org/project', url)
 
     @simple_layout('layouts/basic-github.yaml', driver='github')
     def test_git_ssh_url(self):
         """Test that git_ssh option gives git url with ssh"""
-        url = self.fake_github_ssh.real_getGitUrl('org/project')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        _, project = tenant.getProject('org/project')
+
+        url = self.fake_github_ssh.real_getGitUrl(project)
         self.assertEqual('ssh://git@github.com/org/project.git', url)
 
     @simple_layout('layouts/basic-github.yaml', driver='github')
     def test_git_enterprise_url(self):
         """Test that git_url option gives git url with proper host"""
-        url = self.fake_github_ent.real_getGitUrl('org/project')
+        tenant = self.sched.abide.tenants.get('tenant-one')
+        _, project = tenant.getProject('org/project')
+
+        url = self.fake_github_ent.real_getGitUrl(project)
         self.assertEqual('ssh://git@github.enterprise.io/org/project.git', url)
 
     @simple_layout('layouts/reporting-github.yaml', driver='github')
diff --git a/tests/unit/test_inventory.py b/tests/unit/test_inventory.py
index 1c41f5f..b7e35eb 100644
--- a/tests/unit/test_inventory.py
+++ b/tests/unit/test_inventory.py
@@ -37,6 +37,12 @@
         inv_path = os.path.join(build.jobdir.root, 'ansible', 'inventory.yaml')
         return yaml.safe_load(open(inv_path, 'r'))
 
+    def _get_setup_inventory(self, name):
+        build = self.getBuildByName(name)
+        setup_inv_path = os.path.join(build.jobdir.root, 'ansible',
+                                      'setup-inventory.yaml')
+        return yaml.safe_load(open(setup_inv_path, 'r'))
+
     def test_single_inventory(self):
 
         inventory = self._get_build_inventory('single-inventory')
@@ -119,5 +125,35 @@
             self.assertEqual(
                 inventory['all']['hosts'][node_name]['ansible_user'], username)
 
+            # check if the nodes use the correct or no ansible_connection
+            if node_name == 'windows':
+                self.assertEqual(
+                    inventory['all']['hosts'][node_name]['ansible_connection'],
+                    'winrm')
+            else:
+                self.assertEqual(
+                    'local',
+                    inventory['all']['hosts'][node_name]['ansible_connection'])
+
+        self.executor_server.release()
+        self.waitUntilSettled()
+
+    def test_setup_inventory(self):
+
+        setup_inventory = self._get_setup_inventory('hostvars-inventory')
+        inventory = self._get_build_inventory('hostvars-inventory')
+
+        self.assertIn('all', inventory)
+        self.assertIn('hosts', inventory['all'])
+
+        self.assertIn('default', setup_inventory['all']['hosts'])
+        self.assertIn('fakeuser', setup_inventory['all']['hosts'])
+        self.assertIn('windows', setup_inventory['all']['hosts'])
+        self.assertNotIn('network', setup_inventory['all']['hosts'])
+        self.assertIn('default', inventory['all']['hosts'])
+        self.assertIn('fakeuser', inventory['all']['hosts'])
+        self.assertIn('windows', inventory['all']['hosts'])
+        self.assertIn('network', inventory['all']['hosts'])
+
         self.executor_server.release()
         self.waitUntilSettled()
diff --git a/tests/unit/test_scheduler.py b/tests/unit/test_scheduler.py
index aacc81e..5db20b3 100755
--- a/tests/unit/test_scheduler.py
+++ b/tests/unit/test_scheduler.py
@@ -4196,7 +4196,7 @@
         running_item = running_items[0]
         self.assertEqual([], running_item['failing_reasons'])
         self.assertEqual([], running_item['items_behind'])
-        self.assertEqual('https://hostname/1', running_item['url'])
+        self.assertEqual('https://review.example.com/1', running_item['url'])
         self.assertIsNone(running_item['item_ahead'])
         self.assertEqual('org/project', running_item['project'])
         self.assertIsNone(running_item['remaining_time'])
@@ -4247,611 +4247,6 @@
             'SUCCESS')
         self.assertEqual(A.reported, 1)
 
-    def test_crd_gate(self):
-        "Test cross-repo dependencies"
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-        A.addApproval('Code-Review', 2)
-        B.addApproval('Code-Review', 2)
-
-        AM2 = self.fake_gerrit.addFakeChange('org/project1', 'master', 'AM2')
-        AM1 = self.fake_gerrit.addFakeChange('org/project1', 'master', 'AM1')
-        AM2.setMerged()
-        AM1.setMerged()
-
-        BM2 = self.fake_gerrit.addFakeChange('org/project2', 'master', 'BM2')
-        BM1 = self.fake_gerrit.addFakeChange('org/project2', 'master', 'BM1')
-        BM2.setMerged()
-        BM1.setMerged()
-
-        # A -> AM1 -> AM2
-        # B -> BM1 -> BM2
-        # A Depends-On: B
-        # M2 is here to make sure it is never queried.  If it is, it
-        # means zuul is walking down the entire history of merged
-        # changes.
-
-        B.setDependsOn(BM1, 1)
-        BM1.setDependsOn(BM2, 1)
-
-        A.setDependsOn(AM1, 1)
-        AM1.setDependsOn(AM2, 1)
-
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-
-        for connection in self.connections.connections.values():
-            connection.maintainCache([])
-
-        self.executor_server.hold_jobs_in_build = True
-        B.addApproval('Approved', 1)
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.hold_jobs_in_build = False
-        self.executor_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(AM2.queried, 0)
-        self.assertEqual(BM2.queried, 0)
-        self.assertEqual(A.data['status'], 'MERGED')
-        self.assertEqual(B.data['status'], 'MERGED')
-        self.assertEqual(A.reported, 2)
-        self.assertEqual(B.reported, 2)
-
-        changes = self.getJobFromHistory(
-            'project-merge', 'org/project1').changes
-        self.assertEqual(changes, '2,1 1,1')
-
-    def test_crd_branch(self):
-        "Test cross-repo dependencies in multiple branches"
-
-        self.create_branch('org/project2', 'mp')
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-        C1 = self.fake_gerrit.addFakeChange('org/project2', 'mp', 'C1')
-        C2 = self.fake_gerrit.addFakeChange('org/project2', 'mp', 'C2',
-                                            status='ABANDONED')
-        C1.data['id'] = B.data['id']
-        C2.data['id'] = B.data['id']
-
-        A.addApproval('Code-Review', 2)
-        B.addApproval('Code-Review', 2)
-        C1.addApproval('Code-Review', 2)
-
-        # A Depends-On: B+C1
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        self.executor_server.hold_jobs_in_build = True
-        B.addApproval('Approved', 1)
-        C1.addApproval('Approved', 1)
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.hold_jobs_in_build = False
-        self.executor_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'MERGED')
-        self.assertEqual(B.data['status'], 'MERGED')
-        self.assertEqual(C1.data['status'], 'MERGED')
-        self.assertEqual(A.reported, 2)
-        self.assertEqual(B.reported, 2)
-        self.assertEqual(C1.reported, 2)
-
-        changes = self.getJobFromHistory(
-            'project-merge', 'org/project1').changes
-        self.assertEqual(changes, '2,1 3,1 1,1')
-
-    def test_crd_multiline(self):
-        "Test multiple depends-on lines in commit"
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
-        A.addApproval('Code-Review', 2)
-        B.addApproval('Code-Review', 2)
-        C.addApproval('Code-Review', 2)
-
-        # A Depends-On: B+C
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\nDepends-On: %s\n' % (
-            A.subject, B.data['id'], C.data['id'])
-
-        self.executor_server.hold_jobs_in_build = True
-        B.addApproval('Approved', 1)
-        C.addApproval('Approved', 1)
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.hold_jobs_in_build = False
-        self.executor_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'MERGED')
-        self.assertEqual(B.data['status'], 'MERGED')
-        self.assertEqual(C.data['status'], 'MERGED')
-        self.assertEqual(A.reported, 2)
-        self.assertEqual(B.reported, 2)
-        self.assertEqual(C.reported, 2)
-
-        changes = self.getJobFromHistory(
-            'project-merge', 'org/project1').changes
-        self.assertEqual(changes, '2,1 3,1 1,1')
-
-    def test_crd_unshared_gate(self):
-        "Test cross-repo dependencies in unshared gate queues"
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project', 'master', 'B')
-        A.addApproval('Code-Review', 2)
-        B.addApproval('Code-Review', 2)
-
-        # A Depends-On: B
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        # A and B do not share a queue, make sure that A is unable to
-        # enqueue B (and therefore, A is unable to be enqueued).
-        B.addApproval('Approved', 1)
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(A.reported, 0)
-        self.assertEqual(B.reported, 0)
-        self.assertEqual(len(self.history), 0)
-
-        # Enqueue and merge B alone.
-        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.assertEqual(B.data['status'], 'MERGED')
-        self.assertEqual(B.reported, 2)
-
-        # Now that B is merged, A should be able to be enqueued and
-        # merged.
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'MERGED')
-        self.assertEqual(A.reported, 2)
-
-    def test_crd_gate_reverse(self):
-        "Test reverse cross-repo dependencies"
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-        A.addApproval('Code-Review', 2)
-        B.addApproval('Code-Review', 2)
-
-        # A Depends-On: B
-
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-
-        self.executor_server.hold_jobs_in_build = True
-        A.addApproval('Approved', 1)
-        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-        self.executor_server.hold_jobs_in_build = False
-        self.executor_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'MERGED')
-        self.assertEqual(B.data['status'], 'MERGED')
-        self.assertEqual(A.reported, 2)
-        self.assertEqual(B.reported, 2)
-
-        changes = self.getJobFromHistory(
-            'project-merge', 'org/project1').changes
-        self.assertEqual(changes, '2,1 1,1')
-
-    def test_crd_cycle(self):
-        "Test cross-repo dependency cycles"
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-        A.addApproval('Code-Review', 2)
-        B.addApproval('Code-Review', 2)
-
-        # A -> B -> A (via commit-depends)
-
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            B.subject, A.data['id'])
-
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.assertEqual(A.reported, 0)
-        self.assertEqual(B.reported, 0)
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-
-    def test_crd_gate_unknown(self):
-        "Test unknown projects in dependent pipeline"
-        self.init_repo("org/unknown", tag='init')
-        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/unknown', 'master', 'B')
-        A.addApproval('Code-Review', 2)
-        B.addApproval('Code-Review', 2)
-
-        # A Depends-On: B
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        B.addApproval('Approved', 1)
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        # Unknown projects cannot share a queue with any other
-        # since they don't have common jobs with any other (they have no jobs).
-        # Changes which depend on unknown project changes
-        # should not be processed in dependent pipeline
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(A.reported, 0)
-        self.assertEqual(B.reported, 0)
-        self.assertEqual(len(self.history), 0)
-
-        # Simulate change B being gated outside this layout Set the
-        # change merged before submitting the event so that when the
-        # event triggers a gerrit query to update the change, we get
-        # the information that it was merged.
-        B.setMerged()
-        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
-        self.waitUntilSettled()
-        self.assertEqual(len(self.history), 0)
-
-        # Now that B is merged, A should be able to be enqueued and
-        # merged.
-        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'MERGED')
-        self.assertEqual(A.reported, 2)
-        self.assertEqual(B.data['status'], 'MERGED')
-        self.assertEqual(B.reported, 0)
-
-    def test_crd_check(self):
-        "Test cross-repo dependencies in independent pipelines"
-
-        self.executor_server.hold_jobs_in_build = True
-        self.gearman_server.hold_jobs_in_queue = True
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-
-        # A Depends-On: B
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-
-        self.gearman_server.hold_jobs_in_queue = False
-        self.gearman_server.release()
-        self.waitUntilSettled()
-
-        self.executor_server.release('.*-merge')
-        self.waitUntilSettled()
-
-        self.assertTrue(self.builds[0].hasChanges(A, B))
-
-        self.executor_server.hold_jobs_in_build = False
-        self.executor_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(A.reported, 1)
-        self.assertEqual(B.reported, 0)
-
-        self.assertEqual(self.history[0].changes, '2,1 1,1')
-        tenant = self.sched.abide.tenants.get('tenant-one')
-        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
-
-    def test_crd_check_git_depends(self):
-        "Test single-repo dependencies in independent pipelines"
-        self.gearman_server.hold_jobs_in_build = True
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
-
-        # Add two git-dependent changes and make sure they both report
-        # success.
-        B.setDependsOn(A, 1)
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-
-        self.orderedRelease()
-        self.gearman_server.hold_jobs_in_build = False
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(A.reported, 1)
-        self.assertEqual(B.reported, 1)
-
-        self.assertEqual(self.history[0].changes, '1,1')
-        self.assertEqual(self.history[-1].changes, '1,1 2,1')
-        tenant = self.sched.abide.tenants.get('tenant-one')
-        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
-
-        self.assertIn('Build succeeded', A.messages[0])
-        self.assertIn('Build succeeded', B.messages[0])
-
-    def test_crd_check_duplicate(self):
-        "Test duplicate check in independent pipelines"
-        self.executor_server.hold_jobs_in_build = True
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
-        tenant = self.sched.abide.tenants.get('tenant-one')
-        check_pipeline = tenant.layout.pipelines['check']
-
-        # Add two git-dependent changes...
-        B.setDependsOn(A, 1)
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.assertEqual(len(check_pipeline.getAllItems()), 2)
-
-        # ...make sure the live one is not duplicated...
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.assertEqual(len(check_pipeline.getAllItems()), 2)
-
-        # ...but the non-live one is able to be.
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.assertEqual(len(check_pipeline.getAllItems()), 3)
-
-        # Release jobs in order to avoid races with change A jobs
-        # finishing before change B jobs.
-        self.orderedRelease()
-        self.executor_server.hold_jobs_in_build = False
-        self.executor_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(A.reported, 1)
-        self.assertEqual(B.reported, 1)
-
-        self.assertEqual(self.history[0].changes, '1,1 2,1')
-        self.assertEqual(self.history[1].changes, '1,1')
-        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
-
-        self.assertIn('Build succeeded', A.messages[0])
-        self.assertIn('Build succeeded', B.messages[0])
-
-    def _test_crd_check_reconfiguration(self, project1, project2):
-        "Test cross-repo dependencies re-enqueued in independent pipelines"
-
-        self.gearman_server.hold_jobs_in_queue = True
-        A = self.fake_gerrit.addFakeChange(project1, 'master', 'A')
-        B = self.fake_gerrit.addFakeChange(project2, 'master', 'B')
-
-        # A Depends-On: B
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-
-        self.sched.reconfigure(self.config)
-
-        # Make sure the items still share a change queue, and the
-        # first one is not live.
-        tenant = self.sched.abide.tenants.get('tenant-one')
-        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 1)
-        queue = tenant.layout.pipelines['check'].queues[0]
-        first_item = queue.queue[0]
-        for item in queue.queue:
-            self.assertEqual(item.queue, first_item.queue)
-        self.assertFalse(first_item.live)
-        self.assertTrue(queue.queue[1].live)
-
-        self.gearman_server.hold_jobs_in_queue = False
-        self.gearman_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(A.reported, 1)
-        self.assertEqual(B.reported, 0)
-
-        self.assertEqual(self.history[0].changes, '2,1 1,1')
-        self.assertEqual(len(tenant.layout.pipelines['check'].queues), 0)
-
-    def test_crd_check_reconfiguration(self):
-        self._test_crd_check_reconfiguration('org/project1', 'org/project2')
-
-    def test_crd_undefined_project(self):
-        """Test that undefined projects in dependencies are handled for
-        independent pipelines"""
-        # This is a hack for the fake gerrit, which implies repo
-        # creation upon the creation of any change.
-        self.init_repo("org/unknown", tag='init')
-        self._test_crd_check_reconfiguration('org/project1', 'org/unknown')
-
-    @simple_layout('layouts/ignore-dependencies.yaml')
-    def test_crd_check_ignore_dependencies(self):
-        "Test cross-repo dependencies can be ignored"
-
-        self.gearman_server.hold_jobs_in_queue = True
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-        C = self.fake_gerrit.addFakeChange('org/project2', 'master', 'C')
-
-        # A Depends-On: B
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-        # C git-depends on B
-        C.setDependsOn(B, 1)
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-
-        # Make sure none of the items share a change queue, and all
-        # are live.
-        tenant = self.sched.abide.tenants.get('tenant-one')
-        check_pipeline = tenant.layout.pipelines['check']
-        self.assertEqual(len(check_pipeline.queues), 3)
-        self.assertEqual(len(check_pipeline.getAllItems()), 3)
-        for item in check_pipeline.getAllItems():
-            self.assertTrue(item.live)
-
-        self.gearman_server.hold_jobs_in_queue = False
-        self.gearman_server.release()
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(C.data['status'], 'NEW')
-        self.assertEqual(A.reported, 1)
-        self.assertEqual(B.reported, 1)
-        self.assertEqual(C.reported, 1)
-
-        # Each job should have tested exactly one change
-        for job in self.history:
-            self.assertEqual(len(job.changes.split()), 1)
-
-    @simple_layout('layouts/three-projects.yaml')
-    def test_crd_check_transitive(self):
-        "Test transitive cross-repo dependencies"
-        # Specifically, if A -> B -> C, and C gets a new patchset and
-        # A gets a new patchset, ensure the test of A,2 includes B,1
-        # and C,2 (not C,1 which would indicate stale data in the
-        # cache for B).
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/project2', 'master', 'B')
-        C = self.fake_gerrit.addFakeChange('org/project3', 'master', 'C')
-
-        # A Depends-On: B
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        # B Depends-On: C
-        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            B.subject, C.data['id'])
-
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.assertEqual(self.history[-1].changes, '3,1 2,1 1,1')
-
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.assertEqual(self.history[-1].changes, '3,1 2,1')
-
-        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.assertEqual(self.history[-1].changes, '3,1')
-
-        C.addPatchset()
-        self.fake_gerrit.addEvent(C.getPatchsetCreatedEvent(2))
-        self.waitUntilSettled()
-        self.assertEqual(self.history[-1].changes, '3,2')
-
-        A.addPatchset()
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
-        self.waitUntilSettled()
-        self.assertEqual(self.history[-1].changes, '3,2 2,1 1,2')
-
-    def test_crd_check_unknown(self):
-        "Test unknown projects in independent pipeline"
-        self.init_repo("org/unknown", tag='init')
-        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
-        B = self.fake_gerrit.addFakeChange('org/unknown', 'master', 'D')
-        # A Depends-On: B
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-
-        # Make sure zuul has seen an event on B.
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-
-        self.assertEqual(A.data['status'], 'NEW')
-        self.assertEqual(A.reported, 1)
-        self.assertEqual(B.data['status'], 'NEW')
-        self.assertEqual(B.reported, 0)
-
-    def test_crd_cycle_join(self):
-        "Test an updated change creates a cycle"
-        A = self.fake_gerrit.addFakeChange('org/project2', 'master', 'A')
-
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-        self.assertEqual(A.reported, 1)
-
-        # Create B->A
-        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
-        B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            B.subject, A.data['id'])
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
-        self.waitUntilSettled()
-
-        # Dep is there so zuul should have reported on B
-        self.assertEqual(B.reported, 1)
-
-        # Update A to add A->B (a cycle).
-        A.addPatchset()
-        A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
-            A.subject, B.data['id'])
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
-        self.waitUntilSettled()
-
-        # Dependency cycle injected so zuul should not have reported again on A
-        self.assertEqual(A.reported, 1)
-
-        # Now if we update B to remove the depends-on, everything
-        # should be okay.  B; A->B
-
-        B.addPatchset()
-        B.data['commitMessage'] = '%s\n' % (B.subject,)
-        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(2))
-        self.waitUntilSettled()
-
-        # Cycle was removed so now zuul should have reported again on A
-        self.assertEqual(A.reported, 2)
-
-        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(2))
-        self.waitUntilSettled()
-        self.assertEqual(B.reported, 2)
-
     @simple_layout('layouts/disable_at.yaml')
     def test_disable_at(self):
         "Test a pipeline will only report to the disabled trigger when failing"
@@ -6070,6 +5465,77 @@
         self.assertEqual(B.reported, 1)
 
 
+class TestImplicitProject(ZuulTestCase):
+    tenant_config_file = 'config/implicit-project/main.yaml'
+
+    def test_implicit_project(self):
+        # config project should work with implicit project name
+        A = self.fake_gerrit.addFakeChange('common-config', 'master', 'A')
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+
+        # untrusted project should work with implicit project name
+        B = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
+
+        self.waitUntilSettled()
+
+        self.assertEqual(A.data['status'], 'NEW')
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(B.data['status'], 'NEW')
+        self.assertEqual(B.reported, 1)
+        self.assertHistory([
+            dict(name='test-common', result='SUCCESS', changes='1,1'),
+            dict(name='test-common', result='SUCCESS', changes='2,1'),
+            dict(name='test-project', result='SUCCESS', changes='2,1'),
+        ], ordered=False)
+
+        # Now test adding a further project stanza to the in-repo config.
+        in_repo_conf = textwrap.dedent(
+            """
+            - job:
+                name: test-project
+                run: playbooks/test-project.yaml
+            - job:
+                name: test2-project
+                run: playbooks/test-project.yaml
+
+            - project:
+                check:
+                  jobs:
+                    - test-project
+                gate:
+                  jobs:
+                    - test-project
+
+            - project:
+                check:
+                  jobs:
+                    - test2-project
+                gate:
+                  jobs:
+                    - test2-project
+
+            """)
+        file_dict = {'.zuul.yaml': in_repo_conf}
+        C = self.fake_gerrit.addFakeChange('org/project', 'master', 'A',
+                                           files=file_dict)
+        C.addApproval('Code-Review', 2)
+        self.fake_gerrit.addEvent(C.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        # change C must be merged
+        self.assertEqual(C.data['status'], 'MERGED')
+        self.assertEqual(C.reported, 2)
+        self.assertHistory([
+            dict(name='test-common', result='SUCCESS', changes='1,1'),
+            dict(name='test-common', result='SUCCESS', changes='2,1'),
+            dict(name='test-project', result='SUCCESS', changes='2,1'),
+            dict(name='test-common', result='SUCCESS', changes='3,1'),
+            dict(name='test-project', result='SUCCESS', changes='3,1'),
+            dict(name='test2-project', result='SUCCESS', changes='3,1'),
+        ], ordered=False)
+
+
 class TestSemaphoreInRepo(ZuulTestCase):
     config_file = 'zuul-connections-gerrit-and-github.conf'
     tenant_config_file = 'config/in-repo/main.yaml'
diff --git a/tests/unit/test_streaming.py b/tests/unit/test_streaming.py
index 4bb541a..b999106 100644
--- a/tests/unit/test_streaming.py
+++ b/tests/unit/test_streaming.py
@@ -41,13 +41,13 @@
     def startStreamer(self, port, root=None):
         if not root:
             root = tempfile.gettempdir()
-        return zuul.lib.log_streamer.LogStreamer(None, self.host, port, root)
+        return zuul.lib.log_streamer.LogStreamer(self.host, port, root)
 
     def test_start_stop(self):
-        port = 7900
-        streamer = self.startStreamer(port)
+        streamer = self.startStreamer(0)
         self.addCleanup(streamer.stop)
 
+        port = streamer.server.socket.getsockname()[1]
         s = socket.create_connection((self.host, port))
         s.close()
 
@@ -77,12 +77,13 @@
     def startStreamer(self, port, build_uuid, root=None):
         if not root:
             root = tempfile.gettempdir()
-        self.streamer = zuul.lib.log_streamer.LogStreamer(None, self.host,
+        self.streamer = zuul.lib.log_streamer.LogStreamer(self.host,
                                                           port, root)
+        port = self.streamer.server.socket.getsockname()[1]
         s = socket.create_connection((self.host, port))
         self.addCleanup(s.close)
 
-        req = '%s\n' % build_uuid
+        req = '%s\r\n' % build_uuid
         s.sendall(req.encode('utf-8'))
         self.test_streaming_event.set()
 
@@ -129,10 +130,9 @@
 
         # Create a thread to stream the log. We need this to be happening
         # before we create the flag file to tell the job to complete.
-        port = 7901
         streamer_thread = threading.Thread(
             target=self.startStreamer,
-            args=(port, build.uuid, self.executor_server.jobdir_root,)
+            args=(0, build.uuid, self.executor_server.jobdir_root,)
         )
         streamer_thread.start()
         self.addCleanup(self.stopStreamer)
@@ -196,7 +196,7 @@
                 time.sleep(0.1)
 
         with socket.create_connection(gateway_address) as s:
-            msg = "%s\n" % build_uuid
+            msg = "%s\r\n" % build_uuid
             s.sendall(msg.encode('utf-8'))
             event.set()  # notify we are connected and req sent
             while True:
@@ -209,7 +209,7 @@
     def test_websocket_streaming(self):
         # Start the finger streamer daemon
         streamer = zuul.lib.log_streamer.LogStreamer(
-            None, self.host, 0, self.executor_server.jobdir_root)
+            self.host, 0, self.executor_server.jobdir_root)
         self.addCleanup(streamer.stop)
 
         # Need to set the streaming port before submitting the job
@@ -294,7 +294,7 @@
     def test_finger_gateway(self):
         # Start the finger streamer daemon
         streamer = zuul.lib.log_streamer.LogStreamer(
-            None, self.host, 0, self.executor_server.jobdir_root)
+            self.host, 0, self.executor_server.jobdir_root)
         self.addCleanup(streamer.stop)
         finger_port = streamer.server.socket.getsockname()[1]
 
diff --git a/tests/unit/test_v3.py b/tests/unit/test_v3.py
index 44aa966..163a58b 100755
--- a/tests/unit/test_v3.py
+++ b/tests/unit/test_v3.py
@@ -73,6 +73,110 @@
                          "not affect tenant one")
 
 
+class TestProtected(ZuulTestCase):
+
+    tenant_config_file = 'config/protected/main.yaml'
+
+    def test_protected_ok(self):
+        # test clean usage of a protected parent job
+        in_repo_conf = textwrap.dedent(
+            """
+            - job:
+                name: job-protected
+                protected: true
+                run: playbooks/job-protected.yaml
+
+            - project:
+                name: org/project
+                check:
+                  jobs:
+                    - job-child-ok
+
+            - job:
+                name: job-child-ok
+                parent: job-protected
+
+            - project:
+                name: org/project
+                check:
+                  jobs:
+                    - job-child-ok
+
+            """)
+
+        file_dict = {'zuul.yaml': in_repo_conf}
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A',
+                                           files=file_dict)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(A.patchsets[-1]['approvals'][0]['value'], '1')
+
+    def test_protected_reset(self):
+        # try to reset protected flag
+        in_repo_conf = textwrap.dedent(
+            """
+            - job:
+                name: job-protected
+                protected: true
+                run: playbooks/job-protected.yaml
+
+            - job:
+                name: job-child-reset-protected
+                parent: job-protected
+                protected: false
+
+            - project:
+                name: org/project
+                check:
+                  jobs:
+                    - job-child-reset-protected
+
+            """)
+
+        file_dict = {'zuul.yaml': in_repo_conf}
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A',
+                                           files=file_dict)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        # The child job tried to reset the protected flag inherited
+        # from its parent, so the change should fail.
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(A.patchsets[-1]['approvals'][0]['value'], '-1')
+        self.assertIn('Unable to reset protected attribute', A.messages[0])
+
+    def test_protected_inherit_not_ok(self):
+        # try to inherit from a protected job in different project
+        in_repo_conf = textwrap.dedent(
+            """
+            - job:
+                name: job-child-notok
+                run: playbooks/job-child-notok.yaml
+                parent: job-protected
+
+            - project:
+                name: org/project1
+                check:
+                  jobs:
+                    - job-child-notok
+
+            """)
+
+        file_dict = {'zuul.yaml': in_repo_conf}
+        A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A',
+                                           files=file_dict)
+        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
+        self.waitUntilSettled()
+
+        self.assertEqual(A.reported, 1)
+        self.assertEqual(A.patchsets[-1]['approvals'][0]['value'], '-1')
+        self.assertIn(
+            "which is defined in review.example.com/org/project is protected "
+            "and cannot be inherited from other projects.", A.messages[0])
+
+
 class TestFinal(ZuulTestCase):
 
     tenant_config_file = 'config/final/main.yaml'
@@ -543,11 +647,23 @@
                 name: project-test2
                 run: playbooks/project-test2.yaml
 
+            - job:
+                name: project-test3
+                run: playbooks/project-test2.yaml
+
+            # add a job by the short project name
             - project:
                 name: org/project
                 tenant-one-gate:
                   jobs:
                     - project-test2
+
+            # add a job by the canonical project name
+            - project:
+                name: review.example.com/org/project
+                tenant-one-gate:
+                  jobs:
+                    - project-test3
             """)
 
         in_repo_playbook = textwrap.dedent(
@@ -569,7 +685,9 @@
         self.assertIn('tenant-one-gate', A.messages[1],
                       "A should transit tenant-one gate")
         self.assertHistory([
-            dict(name='project-test2', result='SUCCESS', changes='1,1')])
+            dict(name='project-test2', result='SUCCESS', changes='1,1'),
+            dict(name='project-test3', result='SUCCESS', changes='1,1'),
+        ], ordered=False)
 
         self.fake_gerrit.addEvent(A.getChangeMergedEvent())
         self.waitUntilSettled()
@@ -584,7 +702,10 @@
                          'SUCCESS')
         self.assertHistory([
             dict(name='project-test2', result='SUCCESS', changes='1,1'),
-            dict(name='project-test2', result='SUCCESS', changes='2,1')])
+            dict(name='project-test3', result='SUCCESS', changes='1,1'),
+            dict(name='project-test2', result='SUCCESS', changes='2,1'),
+            dict(name='project-test3', result='SUCCESS', changes='2,1'),
+        ], ordered=False)
 
     def test_dynamic_template(self):
         # Tests that a project can't update a template in another
diff --git a/tests/unit/test_web.py b/tests/unit/test_web.py
new file mode 100644
index 0000000..6881a83
--- /dev/null
+++ b/tests/unit/test_web.py
@@ -0,0 +1,145 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Hewlett-Packard Development Company, L.P.
+# Copyright 2014 Rackspace Australia
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import asyncio
+import threading
+import os
+import json
+import urllib
+import time
+import socket
+from unittest import skip
+
+import webob
+
+import zuul.web
+
+from tests.base import ZuulTestCase, FIXTURE_DIR
+
+
+class TestWeb(ZuulTestCase):
+    tenant_config_file = 'config/single-tenant/main.yaml'
+
+    def setUp(self):
+        super(TestWeb, self).setUp()
+        self.executor_server.hold_jobs_in_build = True
+        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
+        A.addApproval('Code-Review', 2)
+        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
+        B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
+        B.addApproval('Code-Review', 2)
+        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
+        self.waitUntilSettled()
+
+        # Start the web server
+        self.web = zuul.web.ZuulWeb(
+            listen_address='127.0.0.1', listen_port=0,
+            gear_server='127.0.0.1', gear_port=self.gearman_server.port)
+        loop = asyncio.new_event_loop()
+        loop.set_debug(True)
+        ws_thread = threading.Thread(target=self.web.run, args=(loop,))
+        ws_thread.start()
+        self.addCleanup(loop.close)
+        self.addCleanup(ws_thread.join)
+        self.addCleanup(self.web.stop)
+
+        self.host = 'localhost'
+        # Wait until web server is started
+        while True:
+            time.sleep(0.1)
+            if self.web.server is None:
+                continue
+            self.port = self.web.server.sockets[0].getsockname()[1]
+            self.log.debug("ZuulWeb at %s:%s", self.host, self.port)
+            try:
+                with socket.create_connection((self.host, self.port)):
+                    break
+            except ConnectionRefusedError:
+                pass
+
+    def tearDown(self):
+        self.executor_server.hold_jobs_in_build = False
+        self.executor_server.release()
+        self.waitUntilSettled()
+        super(TestWeb, self).tearDown()
+
+    def test_web_status(self):
+        "Test that we can filter to only certain changes in the webapp."
+
+        req = urllib.request.Request(
+            "http://localhost:%s/tenant-one/status.json" % self.port)
+        f = urllib.request.urlopen(req)
+        data = json.loads(f.read().decode('utf8'))
+
+        self.assertIn('pipelines', data)
+
+    def test_web_bad_url(self):
+        # do we 404 correctly
+        req = urllib.request.Request(
+            "http://localhost:%s/status/foo" % self.port)
+        self.assertRaises(urllib.error.HTTPError, urllib.request.urlopen, req)
+
+    @skip("This is not supported by zuul-web")
+    def test_web_find_change(self):
+        # can we filter by change id
+        req = urllib.request.Request(
+            "http://localhost:%s/tenant-one/status/change/1,1" % self.port)
+        f = urllib.request.urlopen(req)
+        data = json.loads(f.read().decode('utf8'))
+
+        self.assertEqual(1, len(data), data)
+        self.assertEqual("org/project", data[0]['project'])
+
+        req = urllib.request.Request(
+            "http://localhost:%s/tenant-one/status/change/2,1" % self.port)
+        f = urllib.request.urlopen(req)
+        data = json.loads(f.read().decode('utf8'))
+
+        self.assertEqual(1, len(data), data)
+        self.assertEqual("org/project1", data[0]['project'], data)
+
+    def test_web_keys(self):
+        with open(os.path.join(FIXTURE_DIR, 'public.pem'), 'rb') as f:
+            public_pem = f.read()
+
+        req = urllib.request.Request(
+            "http://localhost:%s/tenant-one/org/project.pub" %
+            self.port)
+        f = urllib.request.urlopen(req)
+        self.assertEqual(f.read(), public_pem)
+
+    @skip("This may not apply to zuul-web")
+    def test_web_custom_handler(self):
+        def custom_handler(path, tenant_name, request):
+            return webob.Response(body='ok')
+
+        self.webapp.register_path('/custom', custom_handler)
+        req = urllib.request.Request(
+            "http://localhost:%s/custom" % self.port)
+        f = urllib.request.urlopen(req)
+        self.assertEqual(b'ok', f.read())
+
+        self.webapp.unregister_path('/custom')
+        self.assertRaises(urllib.error.HTTPError, urllib.request.urlopen, req)
+
+    @skip("This returns a 500")
+    def test_web_404_on_unknown_tenant(self):
+        req = urllib.request.Request(
+            "http://localhost:{}/non-tenant/status.json".format(self.port))
+        e = self.assertRaises(
+            urllib.error.HTTPError, urllib.request.urlopen, req)
+        self.assertEqual(404, e.code)
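
TestWeb.setUp above starts ZuulWeb in a thread and then loops, sleeping and retrying a TCP connection until the server is reachable. The same readiness check can be written as a small helper with an explicit timeout; this is only a sketch of the pattern, not part of the zuul test API:

    import socket
    import time


    def wait_for_port(host, port, timeout=30.0):
        # Poll until a TCP connection succeeds or the timeout expires.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            try:
                with socket.create_connection((host, port), timeout=1.0):
                    return
            except OSError:
                time.sleep(0.1)
        raise TimeoutError("%s:%s never became reachable" % (host, port))
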
diff --git a/tests/unit/test_zuultrigger.py b/tests/unit/test_zuultrigger.py
index 3954a21..5575853 100644
--- a/tests/unit/test_zuultrigger.py
+++ b/tests/unit/test_zuultrigger.py
@@ -126,5 +126,5 @@
             "dependencies was unable to be automatically merged with the "
             "current state of its repository. Please rebase the change and "
             "upload a new patchset.")
-        self.assertEqual(self.fake_gerrit.queries[1],
-                         "project:org/project status:open")
+        self.assertIn("project:org/project status:open",
+                      self.fake_gerrit.queries)
diff --git a/tools/encrypt_secret.py b/tools/encrypt_secret.py
index c0ee9be..4cb1666 100755
--- a/tools/encrypt_secret.py
+++ b/tools/encrypt_secret.py
@@ -46,6 +46,8 @@
     # TODO(jeblair): Throw a fit if SSL is not used.
     parser.add_argument('project',
                         help="The name of the project.")
+    parser.add_argument('--strip', action='store_true', default=False,
+                        help="Strip whitespace from beginning/end of input.")
     parser.add_argument('--infile',
                         default=None,
                         help="A filename whose contents will be encrypted.  "
@@ -68,6 +70,8 @@
         plaintext = sys.stdin.read()
 
     plaintext = plaintext.encode("utf-8")
+    if args.strip:
+        plaintext = plaintext.strip()
 
     pubkey_file = tempfile.NamedTemporaryFile(delete=False)
     try:
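
The --strip option matters because a secret read from a file or a shell pipeline usually arrives with a trailing newline, which would otherwise be encrypted as part of the value. A small illustration, unrelated to the tool's own code:

    plaintext = 'hunter2\n'.encode('utf-8')   # as read from a file
    print(plaintext.strip())    # b'hunter2'   -- what is encrypted with --strip
    print(plaintext)            # b'hunter2\n' -- what is encrypted otherwise
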
diff --git a/tools/github-debugging.py b/tools/github-debugging.py
new file mode 100755
index 0000000..101fd11
--- /dev/null
+++ b/tools/github-debugging.py
@@ -0,0 +1,71 @@
+#!/usr/bin/env python3
+
+import logging
+
+from zuul.driver.github.githubconnection import GithubConnection
+from zuul.driver.github import GithubDriver
+from zuul.model import Change, Project
+
+# This is a template with boilerplate code for debugging github issues
+
+# TODO: for real use, override the following variables
+server = 'github.com'
+api_token = 'xxxx'
+
+org = 'example'
+repo = 'sandbox'
+pull_nr = 8
+
+
+def configure_logging(context):
+    stream_handler = logging.StreamHandler()
+    logger = logging.getLogger(context)
+    logger.addHandler(stream_handler)
+    logger.setLevel(logging.DEBUG)
+
+
+# uncomment for more logging
+# configure_logging('urllib3')
+# configure_logging('github3')
+# configure_logging('cachecontrol')
+
+
+# This is all that's needed for getting a usable github connection
+def create_connection(server, api_token):
+    driver = GithubDriver()
+    connection_config = {
+        'server': server,
+        'api_token': api_token,
+    }
+    conn = GithubConnection(driver, 'github', connection_config)
+    conn._authenticateGithubAPI()
+    return conn
+
+
+def get_change(connection: GithubConnection,
+               org: str,
+               repo: str,
+               pull: int) -> Change:
+    p = Project("%s/%s" % (org, repo), connection.source)
+    github = connection.getGithubClient(p)
+    pr = github.pull_request(org, repo, pull)
+    sha = pr.head.sha
+    return connection._getChange(p, pull, sha, True)
+
+
+# create github connection
+conn = create_connection(server, api_token)
+
+
+# Now we can do anything we want with the connection, e.g. check canMerge for
+# a pull request.
+change = get_change(conn, org, repo, pull_nr)
+
+print(conn.canMerge(change, {'cc/gate2'}))
+
+
+# Or just use the github object.
+# github = conn.getGithubClient()
+#
+# repository = github.repository(org, repo)
+# print(repository.as_dict())
diff --git a/zuul/cmd/__init__.py b/zuul/cmd/__init__.py
index 2aad4eb..b299219 100755
--- a/zuul/cmd/__init__.py
+++ b/zuul/cmd/__init__.py
@@ -193,8 +193,9 @@
         else:
             # Exercise the pidfile before we do anything else (including
             # logging or daemonizing)
-            with daemon.DaemonContext(pidfile=pid):
+            with pid:
                 pass
+
             with daemon.DaemonContext(pidfile=pid):
                 self.run()
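
The change to zuul/cmd/__init__.py above acquires and releases the pid lockfile once in the foreground before entering the DaemonContext, so a stale or already-held pidfile fails loudly on the terminal instead of inside the detached daemon. A minimal sketch of the same idea with python-daemon; the path and run() below are placeholders, not Zuul's:

    import daemon
    import daemon.pidfile


    def run():
        pass  # stand-in for the real service loop


    pid = daemon.pidfile.TimeoutPIDLockFile('/tmp/example-service.pid', 10)

    # Exercise the pidfile first: if it cannot be acquired, the error is
    # raised here, where the user can still see it.
    with pid:
        pass

    with daemon.DaemonContext(pidfile=pid):
        run()
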
 
diff --git a/zuul/cmd/executor.py b/zuul/cmd/executor.py
index c600dc9..b050a59 100755
--- a/zuul/cmd/executor.py
+++ b/zuul/cmd/executor.py
@@ -14,10 +14,8 @@
 # License for the specific language governing permissions and limitations
 # under the License.
 
-import grp
 import logging
 import os
-import pwd
 import sys
 import signal
 import tempfile
@@ -65,7 +63,7 @@
 
             self.log.info("Starting log streamer")
             streamer = zuul.lib.log_streamer.LogStreamer(
-                self.user, '::', self.finger_port, self.job_dir)
+                '::', self.finger_port, self.job_dir)
 
             # Keep running until the parent dies:
             pipe_read = os.fdopen(pipe_read)
@@ -77,22 +75,6 @@
             os.close(pipe_read)
             self.log_streamer_pid = child_pid
 
-    def change_privs(self):
-        '''
-        Drop our privileges to the zuul user.
-        '''
-        if os.getuid() != 0:
-            return
-        pw = pwd.getpwnam(self.user)
-        # get a list of supplementary groups for the target user, and make sure
-        # we set them when dropping privileges.
-        groups = [g.gr_gid for g in grp.getgrall() if self.user in g.gr_mem]
-        os.setgroups(groups)
-        os.setgid(pw.pw_gid)
-        os.setuid(pw.pw_uid)
-        os.chdir(pw.pw_dir)
-        os.umask(0o022)
-
     def run(self):
         if self.args.command in zuul.executor.server.COMMANDS:
             self.send_command(self.args.command)
@@ -100,8 +82,6 @@
 
         self.configure_connections(source_only=True)
 
-        self.user = get_default(self.config, 'executor', 'user', 'zuul')
-
         if self.config.has_option('executor', 'job_dir'):
             self.job_dir = os.path.expanduser(
                 self.config.get('executor', 'job_dir'))
@@ -121,7 +101,6 @@
         )
 
         self.start_log_streamer()
-        self.change_privs()
 
         ExecutorServer = zuul.executor.server.ExecutorServer
         self.executor = ExecutorServer(self.config, self.connections,
diff --git a/zuul/configloader.py b/zuul/configloader.py
index 71c4ccc..d622370 100644
--- a/zuul/configloader.py
+++ b/zuul/configloader.py
@@ -474,6 +474,7 @@
     # Attributes of a job that can also be used in Project and ProjectTemplate
     job_attributes = {'parent': vs.Any(str, None),
                       'final': bool,
+                      'protected': bool,
                       'failure-message': str,
                       'success-message': str,
                       'failure-url': str,
@@ -513,6 +514,7 @@
 
     simple_attributes = [
         'final',
+        'protected',
         'timeout',
         'workspace',
         'voting',
@@ -852,7 +854,7 @@
 
     def getSchema(self):
         project = {
-            vs.Required('name'): str,
+            'name': str,
             'description': str,
             'templates': [str],
             'merge-mode': vs.Any('merge', 'merge-resolve',
@@ -1228,8 +1230,8 @@
                                                   tenant.config_projects,
                                                   tenant.untrusted_projects,
                                                   cached, tenant)
-        unparsed_config.extend(tenant.config_projects_config, tenant=tenant)
-        unparsed_config.extend(tenant.untrusted_projects_config, tenant=tenant)
+        unparsed_config.extend(tenant.config_projects_config, tenant)
+        unparsed_config.extend(tenant.untrusted_projects_config, tenant)
         tenant.layout = TenantParser._parseLayout(base, tenant,
                                                   unparsed_config,
                                                   scheduler,
@@ -1484,10 +1486,10 @@
                     (job.project,))
                 if job.config_project:
                     config_projects_config.extend(
-                        job.project.unparsed_config)
+                        job.project.unparsed_config, tenant)
                 else:
                     untrusted_projects_config.extend(
-                        job.project.unparsed_config)
+                        job.project.unparsed_config, tenant)
                 continue
             TenantParser.log.debug("Waiting for cat job %s" % (job,))
             job.wait()
@@ -1518,17 +1520,18 @@
                     branch = source_context.branch
                     if source_context.trusted:
                         incdata = TenantParser._parseConfigProjectLayout(
-                            job.files[fn], source_context)
-                        config_projects_config.extend(incdata)
+                            job.files[fn], source_context, tenant)
+                        config_projects_config.extend(incdata, tenant)
                     else:
                         incdata = TenantParser._parseUntrustedProjectLayout(
-                            job.files[fn], source_context)
-                        untrusted_projects_config.extend(incdata)
-                    new_project_unparsed_config[project].extend(incdata)
+                            job.files[fn], source_context, tenant)
+                        untrusted_projects_config.extend(incdata, tenant)
+                    new_project_unparsed_config[project].extend(
+                        incdata, tenant)
                     if branch in new_project_unparsed_branch_config.get(
                             project, {}):
                         new_project_unparsed_branch_config[project][branch].\
-                            extend(incdata)
+                            extend(incdata, tenant)
         # Now that we've successfully loaded all of the configuration,
         # cache the unparsed data on the project objects.
         for project, data in new_project_unparsed_config.items():
@@ -1540,18 +1543,18 @@
         return config_projects_config, untrusted_projects_config
 
     @staticmethod
-    def _parseConfigProjectLayout(data, source_context):
+    def _parseConfigProjectLayout(data, source_context, tenant):
         # This is the top-level configuration for a tenant.
         config = model.UnparsedTenantConfig()
         with early_configuration_exceptions(source_context):
-            config.extend(safe_load_yaml(data, source_context))
+            config.extend(safe_load_yaml(data, source_context), tenant)
         return config
 
     @staticmethod
-    def _parseUntrustedProjectLayout(data, source_context):
+    def _parseUntrustedProjectLayout(data, source_context, tenant):
         config = model.UnparsedTenantConfig()
         with early_configuration_exceptions(source_context):
-            config.extend(safe_load_yaml(data, source_context))
+            config.extend(safe_load_yaml(data, source_context), tenant)
         if config.pipelines:
             with configuration_exceptions('pipeline', config.pipelines[0]):
                 raise PipelineNotPermittedError()
@@ -1753,7 +1756,7 @@
                 else:
                     incdata = project.unparsed_branch_config.get(branch)
                 if incdata:
-                    config.extend(incdata)
+                    config.extend(incdata, tenant)
                 continue
             # Otherwise, do not use the cached config (even if the
             # files are empty as that likely means they were deleted).
@@ -1782,12 +1785,12 @@
 
                     if trusted:
                         incdata = TenantParser._parseConfigProjectLayout(
-                            data, source_context)
+                            data, source_context, tenant)
                     else:
                         incdata = TenantParser._parseUntrustedProjectLayout(
-                            data, source_context)
+                            data, source_context, tenant)
 
-                    config.extend(incdata)
+                    config.extend(incdata, tenant)
 
     def createDynamicLayout(self, tenant, files,
                             include_config_projects=False,
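
The getSchema change above turns the project 'name' key from vs.Required into a plain optional key, which is what allows the implicit-project stanzas exercised in TestImplicitProject. A small voluptuous sketch of the difference (schemas trimmed to the relevant keys):

    import voluptuous as vs

    explicit = vs.Schema({vs.Required('name'): str, 'description': str})
    implicit = vs.Schema({'name': str, 'description': str})

    implicit({})                    # accepted: the project name may be omitted
    explicit({'name': 'org/x'})     # accepted
    # explicit({}) raises MultipleInvalid: required key not provided @ data['name']
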
diff --git a/zuul/driver/gerrit/gerritconnection.py b/zuul/driver/gerrit/gerritconnection.py
index f4b090d..d3b3c00 100644
--- a/zuul/driver/gerrit/gerritconnection.py
+++ b/zuul/driver/gerrit/gerritconnection.py
@@ -442,8 +442,19 @@
         # In case this change is already in the history we have a
         # cyclic dependency and don't need to update ourselves again
         # as this gets done in a previous frame of the call stack.
-        # NOTE(jeblair): I don't think it's possible to hit this case
-        # anymore as all paths hit the change cache first.
+        # NOTE(jeblair): The only case where this can still be hit is
+        # when we get an event for a change with no associated
+        # patchset; for instance, when the gerrit topic is changed.
+        # In that case, we will update change 1234,None, which will be
+        # inserted into the cache as its own entry, but then we will
+        # resolve the patchset before adding it to the history list,
+        # then if there are dependencies, we can walk down and then
+        # back up to the version of this change with a patchset which
+        # will match the history list but will have bypassed the
+        # change cache because the previous object had a patchset of
+        # None.  All paths hit the change cache first.  To be able to
+        # drop history, we need to resolve the patchset on events with
+        # no patchsets before adding the entry to the change cache.
         if (history and change.number and change.patchset and
             (change.number, change.patchset) in history):
             self.log.debug("Change %s is in history" % (change,))
@@ -461,6 +472,11 @@
         change.project = self.source.getProject(data['project'])
         change.branch = data['branch']
         change.url = data['url']
+        change.uris = [
+            '%s/%s' % (self.server, change.number),
+            '%s/#/c/%s' % (self.server, change.number),
+        ]
+
         max_ps = 0
         files = []
         for ps in data['patchSets']:
@@ -481,6 +497,7 @@
         change.open = data['open']
         change.status = data['status']
         change.owner = data['owner']
+        change.message = data['commitMessage']
 
         if change.is_merged:
             # This change is merged, so we don't need to look any further
@@ -494,7 +511,8 @@
             history = history[:]
         history.append((change.number, change.patchset))
 
-        needs_changes = []
+        needs_changes = set()
+        git_needs_changes = []
         if 'dependsOn' in data:
             parts = data['dependsOn'][0]['ref'].split('/')
             dep_num, dep_ps = parts[3], parts[4]
@@ -505,8 +523,11 @@
             # already merged. So even if it is "ABANDONED", we should not
             # ignore it.
             if (not dep.is_merged) and dep not in needs_changes:
-                needs_changes.append(dep)
+                git_needs_changes.append(dep)
+                needs_changes.add(dep)
+        change.git_needs_changes = git_needs_changes
 
+        compat_needs_changes = []
         for record in self._getDependsOnFromCommit(data['commitMessage'],
                                                    change):
             dep_num = record['number']
@@ -516,10 +537,12 @@
                            (change, dep_num, dep_ps))
             dep = self._getChange(dep_num, dep_ps, history=history)
             if dep.open and dep not in needs_changes:
-                needs_changes.append(dep)
-        change.needs_changes = needs_changes
+                compat_needs_changes.append(dep)
+                needs_changes.add(dep)
+        change.compat_needs_changes = compat_needs_changes
 
-        needed_by_changes = []
+        needed_by_changes = set()
+        git_needed_by_changes = []
         if 'neededBy' in data:
             for needed in data['neededBy']:
                 parts = needed['ref'].split('/')
@@ -527,9 +550,13 @@
                 self.log.debug("Updating %s: Getting git-needed change %s,%s" %
                                (change, dep_num, dep_ps))
                 dep = self._getChange(dep_num, dep_ps, history=history)
-                if dep.open and dep.is_current_patchset:
-                    needed_by_changes.append(dep)
+                if (dep.open and dep.is_current_patchset and
+                    dep not in needed_by_changes):
+                    git_needed_by_changes.append(dep)
+                    needed_by_changes.add(dep)
+        change.git_needed_by_changes = git_needed_by_changes
 
+        compat_needed_by_changes = []
         for record in self._getNeededByFromCommit(data['id'], change):
             dep_num = record['number']
             dep_ps = record['currentPatchSet']['number']
@@ -543,9 +570,13 @@
             refresh = (dep_num, dep_ps) not in history
             dep = self._getChange(
                 dep_num, dep_ps, refresh=refresh, history=history)
-            if dep.open and dep.is_current_patchset:
-                needed_by_changes.append(dep)
-        change.needed_by_changes = needed_by_changes
+            if (dep.open and dep.is_current_patchset
+                and dep not in needed_by_changes):
+                compat_needed_by_changes.append(dep)
+                needed_by_changes.add(dep)
+        change.compat_needed_by_changes = compat_needed_by_changes
+
+        self.sched.onChangeUpdated(change)
 
         return change
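
The dependency handling above keeps ordered lists (git_needs_changes, compat_needs_changes) while a single set, needs_changes, guards against recording the same dependency twice. The underlying pattern, isolated from the Gerrit specifics (names and data are illustrative):

    def dedup_preserving_order(candidates):
        seen = set()      # fast membership test
        ordered = []      # keeps first-seen order
        for item in candidates:
            if item in seen:
                continue
            seen.add(item)
            ordered.append(item)
        return ordered


    print(dedup_preserving_order(['4,1', '7,2', '4,1', '9,1']))
    # ['4,1', '7,2', '9,1']
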
 
diff --git a/zuul/driver/gerrit/gerritsource.py b/zuul/driver/gerrit/gerritsource.py
index 7141080..9e327b9 100644
--- a/zuul/driver/gerrit/gerritsource.py
+++ b/zuul/driver/gerrit/gerritsource.py
@@ -12,12 +12,15 @@
 # License for the specific language governing permissions and limitations
 # under the License.
 
+import re
+import urllib
 import logging
 import voluptuous as vs
 from zuul.source import BaseSource
 from zuul.model import Project
 from zuul.driver.gerrit.gerritmodel import GerritRefFilter
 from zuul.driver.util import scalar_or_list, to_list
+from zuul.lib.dependson import find_dependency_headers
 
 
 class GerritSource(BaseSource):
@@ -44,6 +47,61 @@
     def getChange(self, event, refresh=False):
         return self.connection.getChange(event, refresh)
 
+    change_re = re.compile(r"/(\#\/c\/)?(\d+)[\w]*")
+
+    def getChangeByURL(self, url):
+        try:
+            parsed = urllib.parse.urlparse(url)
+        except ValueError:
+            return None
+        m = self.change_re.match(parsed.path)
+        if not m:
+            return None
+        try:
+            change_no = int(m.group(2))
+        except ValueError:
+            return None
+        query = "change:%s" % (change_no,)
+        results = self.connection.simpleQuery(query)
+        if not results:
+            return None
+        change = self.connection._getChange(
+            results[0]['number'], results[0]['currentPatchSet']['number'])
+        return change
+
+    def getChangesDependingOn(self, change, projects):
+        changes = []
+        if not change.uris:
+            return changes
+        queries = set()
+        for uri in change.uris:
+            queries.add('message:%s' % uri)
+        query = '(' + ' OR '.join(queries) + ')'
+        results = self.connection.simpleQuery(query)
+        seen = set()
+        for result in results:
+            for match in find_dependency_headers(result['commitMessage']):
+                found = False
+                for uri in change.uris:
+                    if uri in match:
+                        found = True
+                        break
+                if not found:
+                    continue
+                key = (result['number'], result['currentPatchSet']['number'])
+                if key in seen:
+                    continue
+                seen.add(key)
+                change = self.connection._getChange(
+                    result['number'], result['currentPatchSet']['number'])
+                changes.append(change)
+        return changes
+
+    def getCachedChanges(self):
+        for changes in self.connection._change_cache.values():
+            for change in changes.values():
+                yield change
+
     def getProject(self, name):
         p = self.connection.getProject(name)
         if not p:
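
getChangeByURL above extracts the change number from a Gerrit URL path with change_re. A quick check of what that pattern captures, using made-up path strings (real inputs come from urllib.parse and may differ):

    import re

    change_re = re.compile(r"/(\#\/c\/)?(\d+)[\w]*")

    for path in ('/12345', '/#/c/12345/'):
        m = change_re.match(path)
        print(path, '->', m.group(2))   # both print the change number 12345
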
diff --git a/zuul/driver/git/__init__.py b/zuul/driver/git/__init__.py
index 0faa036..1fe43f6 100644
--- a/zuul/driver/git/__init__.py
+++ b/zuul/driver/git/__init__.py
@@ -15,6 +15,7 @@
 from zuul.driver import Driver, ConnectionInterface, SourceInterface
 from zuul.driver.git import gitconnection
 from zuul.driver.git import gitsource
+from zuul.driver.git import gittrigger
 
 
 class GitDriver(Driver, ConnectionInterface, SourceInterface):
@@ -23,9 +24,15 @@
     def getConnection(self, name, config):
         return gitconnection.GitConnection(self, name, config)
 
+    def getTrigger(self, connection, config=None):
+        return gittrigger.GitTrigger(self, connection, config)
+
     def getSource(self, connection):
         return gitsource.GitSource(self, connection)
 
+    def getTriggerSchema(self):
+        return gittrigger.getSchema()
+
     def getRequireSchema(self):
         return {}
 
diff --git a/zuul/driver/git/gitconnection.py b/zuul/driver/git/gitconnection.py
index f93824d..1886cfc 100644
--- a/zuul/driver/git/gitconnection.py
+++ b/zuul/driver/git/gitconnection.py
@@ -13,12 +13,122 @@
 # License for the specific language governing permissions and limitations
 # under the License.
 
+import os
+import git
+import time
 import logging
 import urllib
+import threading
 
 import voluptuous as v
 
 from zuul.connection import BaseConnection
+from zuul.driver.git.gitmodel import GitTriggerEvent, EMPTY_GIT_REF
+from zuul.model import Ref, Branch
+
+
+class GitWatcher(threading.Thread):
+    log = logging.getLogger("connection.git.GitWatcher")
+
+    def __init__(self, git_connection, baseurl, poll_delay):
+        threading.Thread.__init__(self)
+        self.daemon = True
+        self.git_connection = git_connection
+        self.baseurl = baseurl
+        self.poll_delay = poll_delay
+        self._stopped = False
+        self.projects_refs = self.git_connection.projects_refs
+        # This is used by the test framework
+        self._event_count = 0
+
+    def compareRefs(self, project, refs):
+        partial_events = []
+        # Fetch previous refs state
+        base_refs = self.projects_refs.get(project)
+        # Create list of created refs
+        rcreateds = set(refs.keys()) - set(base_refs.keys())
+        # Create list of deleted refs
+        rdeleteds = set(base_refs.keys()) - set(refs.keys())
+        # Create the list of updated refs
+        updateds = {}
+        for ref, sha in refs.items():
+            if ref in base_refs and base_refs[ref] != sha:
+                updateds[ref] = sha
+        for ref in rcreateds:
+            event = {
+                'ref': ref,
+                'branch_created': True,
+                'oldrev': EMPTY_GIT_REF,
+                'newrev': refs[ref]
+            }
+            partial_events.append(event)
+        for ref in rdeleteds:
+            event = {
+                'ref': ref,
+                'branch_deleted': True,
+                'oldrev': base_refs[ref],
+                'newrev': EMPTY_GIT_REF
+            }
+            partial_events.append(event)
+        for ref, sha in updateds.items():
+            event = {
+                'ref': ref,
+                'branch_updated': True,
+                'oldrev': base_refs[ref],
+                'newrev': sha
+            }
+            partial_events.append(event)
+        events = []
+        for pevent in partial_events:
+            event = GitTriggerEvent()
+            event.type = 'ref-updated'
+            event.project_hostname = self.git_connection.canonical_hostname
+            event.project_name = project
+            for attr in ('ref', 'oldrev', 'newrev', 'branch_created',
+                         'branch_deleted', 'branch_updated'):
+                if attr in pevent:
+                    setattr(event, attr, pevent[attr])
+            events.append(event)
+        return events
+
+    def _run(self):
+        self.log.debug("Walk through projects refs for connection: %s" %
+                       self.git_connection.connection_name)
+        try:
+            for project in self.git_connection.projects:
+                refs = self.git_connection.lsRemote(project)
+                self.log.debug("Read refs %s for project %s" % (refs, project))
+                if not self.projects_refs.get(project):
+                    # State for this project does not exist yet so add it.
+                    # No event will be triggered in this loop as
+                    # projects_refs['project'] and refs are equal
+                    self.projects_refs[project] = refs
+                events = self.compareRefs(project, refs)
+                self.projects_refs[project] = refs
+                # Send events to the scheduler
+                for event in events:
+                    self.log.debug("Handling event: %s" % event)
+                    # Force changes cache update before passing
+                    # the event to the scheduler
+                    self.git_connection.getChange(event)
+                    self.git_connection.logEvent(event)
+                    # Pass the event to the scheduler
+                    self.git_connection.sched.addEvent(event)
+                    self._event_count += 1
+        except Exception:
+            self.log.exception("Unexpected issue in _run loop:")
+
+    def run(self):
+        while not self._stopped:
+            if not self.git_connection.w_pause:
+                self._run()
+                # Polling wait delay
+            else:
+                self.log.debug("Watcher is on pause")
+            time.sleep(self.poll_delay)
+
+    def stop(self):
+        self._stopped = True
 
 
 class GitConnection(BaseConnection):
@@ -32,6 +142,8 @@
             raise Exception('baseurl is required for git connections in '
                             '%s' % self.connection_name)
         self.baseurl = self.connection_config.get('baseurl')
+        self.poll_delay = float(
+            self.connection_config.get('poll_delay', 3600 * 2))
         self.canonical_hostname = self.connection_config.get(
             'canonical_hostname')
         if not self.canonical_hostname:
@@ -40,7 +152,10 @@
                 self.canonical_hostname = r.hostname
             else:
                 self.canonical_hostname = 'localhost'
+        self.w_pause = False
         self.projects = {}
+        self.projects_refs = {}
+        self._change_cache = {}
 
     def getProject(self, name):
         return self.projects.get(name)
@@ -48,15 +163,97 @@
     def addProject(self, project):
         self.projects[project.name] = project
 
+    def getChangeFilesUpdated(self, project_name, branch, tosha):
+        job = self.sched.merger.getFilesChanges(
+            self.connection_name, project_name, branch, tosha)
+        self.log.debug("Waiting for fileschanges job %s" % job)
+        job.wait()
+        if not job.updated:
+            raise Exception("Fileschanges job %s failed" % job)
+        self.log.debug("Fileschanges job %s got changes on files %s" %
+                       (job, job.files))
+        return job.files
+
+    def lsRemote(self, project):
+        refs = {}
+        client = git.cmd.Git()
+        output = client.ls_remote(
+            os.path.join(self.baseurl, project))
+        for line in output.splitlines():
+            sha, ref = line.split('\t')
+            if ref.startswith('refs/'):
+                refs[ref] = sha
+        return refs
+
+    def maintainCache(self, relevant):
+        remove = {}
+        for branch, refschange in self._change_cache.items():
+            for ref, change in refschange.items():
+                if change not in relevant:
+                    remove.setdefault(branch, []).append(ref)
+        for branch, refs in remove.items():
+            for ref in refs:
+                del self._change_cache[branch][ref]
+            if not self._change_cache[branch]:
+                del self._change_cache[branch]
+
+    def getChange(self, event, refresh=False):
+        if event.ref and event.ref.startswith('refs/heads/'):
+            branch = event.ref[len('refs/heads/'):]
+            change = self._change_cache.get(branch, {}).get(event.newrev)
+            if change:
+                return change
+            project = self.getProject(event.project_name)
+            change = Branch(project)
+            change.branch = branch
+            for attr in ('ref', 'oldrev', 'newrev'):
+                setattr(change, attr, getattr(event, attr))
+            change.url = ""
+            change.files = self.getChangeFilesUpdated(
+                event.project_name, change.branch, event.oldrev)
+            self._change_cache.setdefault(branch, {})[event.newrev] = change
+        elif event.ref:
+            # catch-all ref (i.e. not a branch or head)
+            project = self.getProject(event.project_name)
+            change = Ref(project)
+            for attr in ('ref', 'oldrev', 'newrev'):
+                setattr(change, attr, getattr(event, attr))
+            change.url = ""
+        else:
+            self.log.warning("Unable to get change for %s" % (event,))
+            change = None
+        return change
+
     def getProjectBranches(self, project, tenant):
-        # TODO(jeblair): implement; this will need to handle local or
-        # remote git urls.
-        return ['master']
+        refs = self.lsRemote(project.name)
+        branches = [ref[len('refs/heads/'):] for ref in
+                    refs if ref.startswith('refs/heads/')]
+        return branches
 
     def getGitUrl(self, project):
         url = '%s/%s' % (self.baseurl, project.name)
         return url
 
+    def onLoad(self):
+        self.log.debug("Starting Git Watcher")
+        self._start_watcher_thread()
+
+    def onStop(self):
+        self.log.debug("Stopping Git Watcher")
+        self._stop_watcher_thread()
+
+    def _stop_watcher_thread(self):
+        if self.watcher_thread:
+            self.watcher_thread.stop()
+            self.watcher_thread.join()
+
+    def _start_watcher_thread(self):
+        self.watcher_thread = GitWatcher(
+            self,
+            self.baseurl,
+            self.poll_timeout)
+        self.watcher_thread.start()
+
 
 def getSchema():
     git_connection = v.Any(str, v.Schema(dict))
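
(Illustrative aside, not part of the patch.) The watcher's polling model boils down to snapshotting "git ls-remote" output and diffing consecutive snapshots; a minimal standalone sketch, assuming GitPython is installed and using a placeholder repository URL:

import git

EMPTY_GIT_REF = '0' * 40  # all-zero sha marks created/deleted refs


def ls_remote(url):
    # Parse "sha<TAB>ref" lines from git ls-remote into {ref: sha}
    refs = {}
    for line in git.cmd.Git().ls_remote(url).splitlines():
        sha, ref = line.split('\t')
        if ref.startswith('refs/'):
            refs[ref] = sha
    return refs


def compare_refs(old, new):
    # Yield (kind, ref, oldrev, newrev) tuples describing what changed
    for ref, sha in new.items():
        if ref not in old:
            yield 'created', ref, EMPTY_GIT_REF, sha
        elif old[ref] != sha:
            yield 'updated', ref, old[ref], sha
    for ref, sha in old.items():
        if ref not in new:
            yield 'deleted', ref, sha, EMPTY_GIT_REF

# Example polling pass (the URL is a placeholder):
# previous = ls_remote('https://git.example.com/project')
# ... some time later ...
# current = ls_remote('https://git.example.com/project')
# for event in compare_refs(previous, current):
#     print(event)
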
diff --git a/zuul/driver/git/gitmodel.py b/zuul/driver/git/gitmodel.py
new file mode 100644
index 0000000..5d12b36
--- /dev/null
+++ b/zuul/driver/git/gitmodel.py
@@ -0,0 +1,86 @@
+# Copyright 2017 Red Hat, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import re
+
+from zuul.model import TriggerEvent
+from zuul.model import EventFilter
+
+
+EMPTY_GIT_REF = '0' * 40  # git sha of all zeros, used during creates/deletes
+
+
+class GitTriggerEvent(TriggerEvent):
+    """Incoming event from an external system."""
+
+    def __repr__(self):
+        ret = '<GitTriggerEvent %s %s' % (self.type,
+                                          self.project_name)
+
+        if self.branch:
+            ret += " %s" % self.branch
+        ret += " oldrev:%s" % self.oldrev
+        ret += " newrev:%s" % self.newrev
+        ret += '>'
+
+        return ret
+
+
+class GitEventFilter(EventFilter):
+    def __init__(self, trigger, types=[], refs=[],
+                 ignore_deletes=True):
+
+        super().__init__(trigger)
+
+        self._refs = refs
+        self.types = types
+        self.refs = [re.compile(x) for x in refs]
+        self.ignore_deletes = ignore_deletes
+
+    def __repr__(self):
+        ret = '<GitEventFilter'
+
+        if self.types:
+            ret += ' types: %s' % ', '.join(self.types)
+        if self._refs:
+            ret += ' refs: %s' % ', '.join(self._refs)
+        if self.ignore_deletes:
+            ret += ' ignore_deletes: %s' % self.ignore_deletes
+        ret += '>'
+
+        return ret
+
+    def matches(self, event, change):
+        # event types are ORed
+        matches_type = False
+        for etype in self.types:
+            if etype == event.type:
+                matches_type = True
+        if self.types and not matches_type:
+            return False
+
+        # refs are ORed
+        matches_ref = False
+        if event.ref is not None:
+            for ref in self.refs:
+                if ref.match(event.ref):
+                    matches_ref = True
+        if self.refs and not matches_ref:
+            return False
+        if self.ignore_deletes and event.newrev == EMPTY_GIT_REF:
+            # If the updated ref has an empty git sha (all 0s),
+            # then the ref is being deleted
+            return False
+
+        return True
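
(Illustrative aside, not part of the patch.) The filter semantics above, ORed event types, ORed ref regexes, and deletes detected via an all-zero newrev, can be exercised standalone; a small sketch with made-up event values:

import re

EMPTY_GIT_REF = '0' * 40


def matches(event_type, ref, newrev,
            types=('ref-updated',),
            refs=(r'^refs/heads/master$',),
            ignore_deletes=True):
    # event types are ORed
    if types and event_type not in types:
        return False
    # ref regexes are ORed
    if refs and not any(re.match(r, ref) for r in refs):
        return False
    # an all-zero newrev means the ref is being deleted
    if ignore_deletes and newrev == EMPTY_GIT_REF:
        return False
    return True

assert matches('ref-updated', 'refs/heads/master', 'a' * 40)
assert not matches('ref-updated', 'refs/heads/master', EMPTY_GIT_REF)
assert not matches('ref-updated', 'refs/tags/v1.0', 'a' * 40)
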
diff --git a/zuul/driver/git/gitsource.py b/zuul/driver/git/gitsource.py
index 8d85c08..a7d42be 100644
--- a/zuul/driver/git/gitsource.py
+++ b/zuul/driver/git/gitsource.py
@@ -36,7 +36,16 @@
         raise NotImplemented()
 
     def getChange(self, event, refresh=False):
-        raise NotImplemented()
+        return self.connection.getChange(event, refresh)
+
+    def getChangeByURL(self, url):
+        return None
+
+    def getChangesDependingOn(self, change, projects):
+        return []
+
+    def getCachedChanges(self):
+        return []
 
     def getProject(self, name):
         p = self.connection.getProject(name)
diff --git a/zuul/driver/git/gittrigger.py b/zuul/driver/git/gittrigger.py
new file mode 100644
index 0000000..2885230
--- /dev/null
+++ b/zuul/driver/git/gittrigger.py
@@ -0,0 +1,49 @@
+# Copyright 2017 Red Hat, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import logging
+import voluptuous as v
+from zuul.trigger import BaseTrigger
+from zuul.driver.git.gitmodel import GitEventFilter
+from zuul.driver.util import scalar_or_list, to_list
+
+
+class GitTrigger(BaseTrigger):
+    name = 'git'
+    log = logging.getLogger("zuul.GitTrigger")
+
+    def getEventFilters(self, trigger_conf):
+        efilters = []
+        for trigger in to_list(trigger_conf):
+            f = GitEventFilter(
+                trigger=self,
+                types=to_list(trigger['event']),
+                refs=to_list(trigger.get('ref')),
+                ignore_deletes=trigger.get(
+                    'ignore-deletes', True)
+            )
+            efilters.append(f)
+
+        return efilters
+
+
+def getSchema():
+    git_trigger = {
+        v.Required('event'):
+            scalar_or_list(v.Any('ref-updated')),
+        'ref': scalar_or_list(str),
+        'ignore-deletes': bool,
+    }
+
+    return git_trigger
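
(Illustrative aside, not part of the patch.) A trigger stanza that would pass the schema above can be checked directly with voluptuous; scalar_or_list is approximated inline, and the stanza values are made up:

import voluptuous as v


def scalar_or_list(x):
    # Approximation of zuul.driver.util.scalar_or_list
    return v.Any([x], x)


schema = v.Schema({
    v.Required('event'): scalar_or_list(v.Any('ref-updated')),
    'ref': scalar_or_list(str),
    'ignore-deletes': bool,
})

schema({
    'event': 'ref-updated',
    'ref': '^refs/heads/master$',
    'ignore-deletes': True,
})
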
diff --git a/zuul/driver/github/githubconnection.py b/zuul/driver/github/githubconnection.py
index f987f47..b766c6f 100644
--- a/zuul/driver/github/githubconnection.py
+++ b/zuul/driver/github/githubconnection.py
@@ -24,6 +24,7 @@
 
 import cachecontrol
 from cachecontrol.cache import DictCache
+from cachecontrol.heuristics import BaseHeuristic
 import iso8601
 import jwt
 import requests
@@ -34,14 +35,12 @@
 import github3.exceptions
 
 from zuul.connection import BaseConnection
-from zuul.model import Ref, Branch, Tag
+from zuul.model import Ref, Branch, Tag, Project
 from zuul.exceptions import MergeFailure
 from zuul.driver.github.githubmodel import PullRequest, GithubTriggerEvent
 
-ACCESS_TOKEN_URL = 'https://api.github.com/installations/%s/access_tokens'
+GITHUB_BASE_URL = 'https://api.github.com'
 PREVIEW_JSON_ACCEPT = 'application/vnd.github.machine-man-preview+json'
-INSTALLATIONS_URL = 'https://api.github.com/app/installations'
-REPOS_URL = 'https://api.github.com/installation/repositories'
 
 
 def _sign_request(body, secret):
@@ -137,7 +136,6 @@
     """Move events from GitHub into the scheduler"""
 
     log = logging.getLogger("zuul.GithubEventConnector")
-    delay = 10.0
 
     def __init__(self, connection):
         super(GithubEventConnector, self).__init__()
@@ -153,14 +151,6 @@
         ts, json_body, event_type = self.connection.getEvent()
         if self._stopped:
             return
-        # Github can produce inconsistent data immediately after an
-        # event, So ensure that we do not deliver the event to Zuul
-        # until at least a certain amount of time has passed.  Note
-        # that if we receive several events in succession, we will
-        # only need to delay for the first event.  In essence, Zuul
-        # should always be a constant number of seconds behind Github.
-        now = time.time()
-        time.sleep(max((ts + self.delay) - now, 0.0))
 
         # If there's any installation mapping information in the body then
         # update the project mapping before any requests are made.
@@ -351,7 +341,9 @@
     def _get_sender(self, body):
         login = body.get('sender').get('login')
         if login:
-            return self.connection.getUser(login)
+            # TODO(tobiash): it might be better to plumb in the installation id
+            project = body.get('repository', {}).get('full_name')
+            return self.connection.getUser(login, project=project)
 
     def run(self):
         while True:
@@ -415,6 +407,11 @@
         self.source = driver.getSource(self)
         self.event_queue = queue.Queue()
 
+        if self.server == 'github.com':
+            self.base_url = GITHUB_BASE_URL
+        else:
+            self.base_url = 'https://%s/api/v3' % self.server
+
         # ssl verification must default to true
         verify_ssl = self.connection_config.get('verify_ssl', 'true')
         self.verify_ssl = True
@@ -431,9 +428,26 @@
         # NOTE(jamielennox): Better here would be to cache to memcache or file
         # or something external - but zuul already sucks at restarting so in
         # memory probably doesn't make this much worse.
+
+        # NOTE(tobiash): Contrary to its documentation, cachecontrol does
+        # not prioritize etag caching; it doesn't even re-request until
+        # max-age has elapsed.
+        #
+        # Thus we need to add a custom caching heuristic which simply drops
+        # the cache-control header containing max-age. This way we force
+        # cachecontrol to rely only on the etag headers.
+        #
+        # http://cachecontrol.readthedocs.io/en/latest/etags.html
+        # http://cachecontrol.readthedocs.io/en/latest/custom_heuristics.html
+        class NoAgeHeuristic(BaseHeuristic):
+            def update_headers(self, response):
+                if 'cache-control' in response.headers:
+                    del response.headers['cache-control']
+
         self.cache_adapter = cachecontrol.CacheControlAdapter(
             DictCache(),
-            cache_etags=True)
+            cache_etags=True,
+            heuristic=NoAgeHeuristic())
 
         # The regex is based on the connection host. We do not yet support
         # cross-connection dependency gathering
@@ -530,12 +544,21 @@
 
         return headers
 
-    def _get_installation_key(self, project, user_id=None, inst_id=None):
+    def _get_installation_key(self, project, user_id=None, inst_id=None,
+                              reprime=False):
         installation_id = inst_id
         if project is not None:
             installation_id = self.installation_map.get(project)
 
         if not installation_id:
+            if reprime:
+                # prime the installation map and try again, but only once
+                self._prime_installation_map()
+                return self._get_installation_key(project,
+                                                  user_id=user_id,
+                                                  inst_id=inst_id,
+                                                  reprime=False)
+
             self.log.error("No installation ID available for project %s",
                            project)
             return ''
@@ -546,7 +569,10 @@
 
         if ((not expiry) or (not token) or (now >= expiry)):
             headers = self._get_app_auth_headers()
-            url = ACCESS_TOKEN_URL % installation_id
+
+            url = "%s/installations/%s/access_tokens" % (self.base_url,
+                                                         installation_id)
+
             json_data = {'user_id': user_id} if user_id else None
 
             response = requests.post(url, headers=headers, json=json_data)
@@ -568,7 +594,8 @@
         if not self.app_id:
             return
 
-        url = INSTALLATIONS_URL
+        url = '%s/app/installations' % self.base_url
+
         headers = self._get_app_auth_headers()
         self.log.debug("Fetching installations for GitHub app")
         response = requests.get(url, headers=headers)
@@ -581,7 +608,9 @@
             token = self._get_installation_key(project=None, inst_id=inst_id)
             headers = {'Accept': PREVIEW_JSON_ACCEPT,
                        'Authorization': 'token %s' % token}
-            url = REPOS_URL
+
+            url = '%s/installation/repositories' % self.base_url
+
             self.log.debug("Fetching repos for install %s" % inst_id)
             response = requests.get(url, headers=headers)
             response.raise_for_status()
@@ -617,9 +646,12 @@
         return self._github
 
     def maintainCache(self, relevant):
+        remove = set()
         for key, change in self._change_cache.items():
             if change not in relevant:
-                del self._change_cache[key]
+                remove.add(key)
+        for key in remove:
+            del self._change_cache[key]
 
     def getChange(self, event, refresh=False):
         """Get the change representing an event."""
@@ -629,7 +661,9 @@
             change = self._getChange(project, event.change_number,
                                      event.patch_number, refresh=refresh)
             change.url = event.change_url
-            change.updated_at = self._ghTimestampToDate(event.updated_at)
+            change.uris = [
+                '%s/%s/pull/%s' % (self.server, project, change.number),
+            ]
             change.source_event = event
             change.is_current_patchset = (change.pr.get('head').get('sha') ==
                                           event.patch_number)
@@ -650,8 +684,7 @@
             change.files = self.getPushedFileNames(event)
         return change
 
-    def _getChange(self, project, number, patchset=None, refresh=False,
-                   history=None):
+    def _getChange(self, project, number, patchset=None, refresh=False):
         key = (project.name, number, patchset)
         change = self._change_cache.get(key)
         if change and not refresh:
@@ -663,70 +696,79 @@
             change.patchset = patchset
         self._change_cache[key] = change
         try:
-            self._updateChange(change, history)
+            self._updateChange(change)
         except Exception:
             if key in self._change_cache:
                 del self._change_cache[key]
             raise
         return change
 
-    def _getDependsOnFromPR(self, body):
-        prs = []
-        seen = set()
+    def getChangesDependingOn(self, change, projects):
+        changes = []
+        if not change.uris:
+            return changes
 
-        for match in self.depends_on_re.findall(body):
-            if match in seen:
-                self.log.debug("Ignoring duplicate Depends-On: %s" % (match,))
-                continue
-            seen.add(match)
-            # Get the github url
-            url = match.rsplit()[-1]
-            # break it into the parts we need
-            _, org, proj, _, num = url.rsplit('/', 4)
-            # Get a pull object so we can get the head sha
-            pull = self.getPull('%s/%s' % (org, proj), int(num))
-            prs.append(pull)
+        # Get a list of projects with unique installation ids
+        installation_ids = set()
+        installation_projects = set()
 
-        return prs
+        if projects:
+            # We only need to find changes in projects in the supplied
+            # ChangeQueue.  Find all of the github installations for
+            # all of those projects, and search using each of them, so
+            # that we get the right results based on the
+            # permissions granted to each of the installations.  The
+            # common case for this is likely to be just one
+            # installation -- change queues aren't likely to span more
+            # than one installation.
+            for project in projects:
+                installation_id = self.installation_map.get(project)
+                if installation_id not in installation_ids:
+                    installation_ids.add(installation_id)
+                    installation_projects.add(project)
+        else:
+            # We aren't in the context of a change queue and we just
+            # need to query all installations.  This currently only
+            # happens if certain features of the zuul trigger are
+            # used; generally it should be avoided.
+            for project, installation_id in self.installation_map.items():
+                if installation_id not in installation_ids:
+                    installation_ids.add(installation_id)
+                    installation_projects.add(project)
 
-    def _getNeededByFromPR(self, change):
-        prs = []
-        seen = set()
-        # This shouldn't return duplicate issues, but code as if it could
-
-        # This leaves off the protocol, but looks for the specific GitHub
-        # hostname, the org/project, and the pull request number.
-        pattern = 'Depends-On %s/%s/pull/%s' % (self.server,
-                                                change.project.name,
-                                                change.number)
+        keys = set()
+        pattern = ' OR '.join(change.uris)
         query = '%s type:pr is:open in:body' % pattern
-        github = self.getGithubClient()
-        for issue in github.search_issues(query=query):
-            pr = issue.issue.pull_request().as_dict()
-            if not pr.get('url'):
-                continue
-            if issue in seen:
-                continue
-            # the issue provides no good description of the project :\
-            org, proj, _, num = pr.get('url').split('/')[-4:]
-            self.log.debug("Found PR %s/%s/%s needs %s/%s" %
-                           (org, proj, num, change.project.name,
-                            change.number))
-            prs.append(pr)
-            seen.add(issue)
+        # Repeat the search for each installation id (project)
+        for installation_project in installation_projects:
+            github = self.getGithubClient(installation_project)
+            for issue in github.search_issues(query=query):
+                pr = issue.issue.pull_request().as_dict()
+                if not pr.get('url'):
+                    continue
+                # the issue provides no good description of the project :\
+                org, proj, _, num = pr.get('url').split('/')[-4:]
+                proj = pr.get('base').get('repo').get('full_name')
+                sha = pr.get('head').get('sha')
+                key = (proj, num, sha)
+                if key in keys:
+                    continue
+                self.log.debug("Found PR %s/%s needs %s/%s" %
+                               (proj, num, change.project.name,
+                                change.number))
+                keys.add(key)
+            self.log.debug("Ran search issues: %s", query)
+            log_rate_limit(self.log, github)
 
-        self.log.debug("Ran search issues: %s", query)
-        log_rate_limit(self.log, github)
-        return prs
+        for key in keys:
+            (proj, num, sha) = key
+            project = self.source.getProject(proj)
+            change = self._getChange(project, int(num), patchset=sha)
+            changes.append(change)
 
-    def _updateChange(self, change, history=None):
+        return changes
 
-        # If this change is already in the history, we have a cyclic
-        # dependency loop and we do not need to update again, since it
-        # was done in a previous frame.
-        if history and (change.project.name, change.number) in history:
-            return change
-
+    def _updateChange(self, change):
         self.log.info("Updating %s" % (change,))
         change.pr = self.getPull(change.project.name, change.number)
         change.ref = "refs/pull/%s/head" % change.number
@@ -740,63 +782,37 @@
         change.reviews = self.getPullReviews(change.project,
                                              change.number)
         change.labels = change.pr.get('labels')
-        change.body = change.pr.get('body')
-        # ensure body is at least an empty string
-        if not change.body:
-            change.body = ''
+        # ensure message is at least an empty string
+        change.message = change.pr.get('body') or ''
+        change.updated_at = self._ghTimestampToDate(
+            change.pr.get('updated_at'))
 
-        if history is None:
-            history = []
-        else:
-            history = history[:]
-        history.append((change.project.name, change.number))
-
-        needs_changes = []
-
-        # Get all the PRs this may depend on
-        for pr in self._getDependsOnFromPR(change.body):
-            proj = pr.get('base').get('repo').get('full_name')
-            pull = pr.get('number')
-            self.log.debug("Updating %s: Getting dependent "
-                           "pull request %s/%s" %
-                           (change, proj, pull))
-            project = self.source.getProject(proj)
-            dep = self._getChange(project, pull,
-                                  patchset=pr.get('head').get('sha'),
-                                  history=history)
-            if (not dep.is_merged) and dep not in needs_changes:
-                needs_changes.append(dep)
-
-        change.needs_changes = needs_changes
-
-        needed_by_changes = []
-        for pr in self._getNeededByFromPR(change):
-            proj = pr.get('base').get('repo').get('full_name')
-            pull = pr.get('number')
-            self.log.debug("Updating %s: Getting needed "
-                           "pull request %s/%s" %
-                           (change, proj, pull))
-            project = self.source.getProject(proj)
-            dep = self._getChange(project, pull,
-                                  patchset=pr.get('head').get('sha'),
-                                  history=history)
-            if not dep.is_merged:
-                needed_by_changes.append(dep)
-        change.needed_by_changes = needed_by_changes
+        self.sched.onChangeUpdated(change)
 
         return change
 
-    def getGitUrl(self, project):
+    def getGitUrl(self, project: Project):
         if self.git_ssh_key:
-            return 'ssh://git@%s/%s.git' % (self.server, project)
+            return 'ssh://git@%s/%s.git' % (self.server, project.name)
+
+        # If app_id is configured but self.app_id is empty, we are not
+        # yet authenticated against github as an app.
+        if not self.app_id and self.connection_config.get('app_id', None):
+            self._authenticateGithubAPI()
+            self._prime_installation_map()
 
         if self.app_id:
-            installation_key = self._get_installation_key(project)
+            # We may be in the context of a merger or executor here. The
+            # mergers and executors don't receive webhook events so they miss
+            # new repository installations. In order to cope with this we need
+            # to reprime the installation map if we don't find the repo there.
+            installation_key = self._get_installation_key(project.name,
+                                                          reprime=True)
             return 'https://x-access-token:%s@%s/%s' % (installation_key,
                                                         self.server,
-                                                        project)
+                                                        project.name)
 
-        return 'https://%s/%s' % (self.server, project)
+        return 'https://%s/%s' % (self.server, project.name)
 
     def getGitwebUrl(self, project, sha=None):
         url = 'https://%s/%s' % (self.server, project)
@@ -956,8 +972,8 @@
         log_rate_limit(self.log, github)
         return reviews
 
-    def getUser(self, login):
-        return GithubUser(self.getGithubClient(), login)
+    def getUser(self, login, project=None):
+        return GithubUser(self.getGithubClient(project), login)
 
     def getUserUri(self, login):
         return 'https://%s/%s' % (self.server, login)
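
(Illustrative aside, not part of the patch.) The etag-only caching trick above can be reproduced with plain requests plus cachecontrol; a minimal sketch, assuming the public GitHub API host as the mounted prefix:

import requests
import cachecontrol
from cachecontrol.cache import DictCache
from cachecontrol.heuristics import BaseHeuristic


class NoAgeHeuristic(BaseHeuristic):
    def update_headers(self, response):
        # Drop max-age so cachecontrol falls back to etag revalidation
        if 'cache-control' in response.headers:
            del response.headers['cache-control']


session = requests.Session()
session.mount('https://api.github.com/', cachecontrol.CacheControlAdapter(
    DictCache(), cache_etags=True, heuristic=NoAgeHeuristic()))

# Repeated GETs of the same URL now revalidate via If-None-Match and are
# served from the in-memory cache on 304 responses.
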
diff --git a/zuul/driver/github/githubmodel.py b/zuul/driver/github/githubmodel.py
index ffd1c3f..0731dd7 100644
--- a/zuul/driver/github/githubmodel.py
+++ b/zuul/driver/github/githubmodel.py
@@ -37,7 +37,8 @@
         self.labels = []
 
     def isUpdateOf(self, other):
-        if (hasattr(other, 'number') and self.number == other.number and
+        if (self.project == other.project and
+            hasattr(other, 'number') and self.number == other.number and
             hasattr(other, 'patchset') and self.patchset != other.patchset and
             hasattr(other, 'updated_at') and
             self.updated_at > other.updated_at):
diff --git a/zuul/driver/github/githubreporter.py b/zuul/driver/github/githubreporter.py
index 505757f..848ae1b 100644
--- a/zuul/driver/github/githubreporter.py
+++ b/zuul/driver/github/githubreporter.py
@@ -75,6 +75,14 @@
                     msg = self._formatItemReportMergeFailure(item)
                     self.addPullComment(item, msg)
 
+    def _formatItemReportJobs(self, item):
+        # Return the list of jobs portion of the report
+        ret = ''
+        jobs_fields = self._getItemReportJobsFields(item)
+        for job_fields in jobs_fields:
+            ret += '- [%s](%s) : %s%s%s%s\n' % job_fields
+        return ret
+
     def addPullComment(self, item, comment=None):
         message = comment or self._formatItemReport(item)
         project = item.change.project.name
diff --git a/zuul/driver/github/githubsource.py b/zuul/driver/github/githubsource.py
index 1e7e07a..33f8f7c 100644
--- a/zuul/driver/github/githubsource.py
+++ b/zuul/driver/github/githubsource.py
@@ -12,6 +12,8 @@
 # License for the specific language governing permissions and limitations
 # under the License.
 
+import re
+import urllib
 import logging
 import time
 import voluptuous as v
@@ -44,6 +46,8 @@
         if not change.number:
             # Not a pull request, considering merged.
             return True
+        # We don't need to perform another query because the API call
+        # to perform the merge will ensure this is updated.
         return change.is_merged
 
     def canMerge(self, change, allow_needs):
@@ -61,6 +65,38 @@
     def getChange(self, event, refresh=False):
         return self.connection.getChange(event, refresh)
 
+    change_re = re.compile(r"/(.*?)/(.*?)/pull/(\d+)[\w]*")
+
+    def getChangeByURL(self, url):
+        try:
+            parsed = urllib.parse.urlparse(url)
+        except ValueError:
+            return None
+        m = self.change_re.match(parsed.path)
+        if not m:
+            return None
+        org = m.group(1)
+        proj = m.group(2)
+        try:
+            num = int(m.group(3))
+        except ValueError:
+            return None
+        pull = self.connection.getPull('%s/%s' % (org, proj), int(num))
+        if not pull:
+            return None
+        proj = pull.get('base').get('repo').get('full_name')
+        project = self.getProject(proj)
+        change = self.connection._getChange(
+            project, num,
+            patchset=pull.get('head').get('sha'))
+        return change
+
+    def getChangesDependingOn(self, change, projects):
+        return self.connection.getChangesDependingOn(change, projects)
+
+    def getCachedChanges(self):
+        return self.connection._change_cache.values()
+
     def getProject(self, name):
         p = self.connection.getProject(name)
         if not p:
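
(Illustrative aside, not part of the patch.) How getChangeByURL reduces a pull request URL to an org, project and number; the URL below is a made-up example:

import re
import urllib.parse

change_re = re.compile(r"/(.*?)/(.*?)/pull/(\d+)[\w]*")

parsed = urllib.parse.urlparse('https://github.com/acme/widgets/pull/42')
m = change_re.match(parsed.path)
if m:
    org, proj, num = m.group(1), m.group(2), int(m.group(3))
    # org == 'acme', proj == 'widgets', num == 42
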
diff --git a/zuul/driver/sql/alembic.ini b/zuul/driver/sql/alembic.ini
new file mode 100644
index 0000000..e94d496
--- /dev/null
+++ b/zuul/driver/sql/alembic.ini
@@ -0,0 +1,2 @@
+[alembic]
+script_location = alembic
diff --git a/zuul/driver/sql/alembic/versions/19d3a3ebfe1d_change_patchset_to_string.py b/zuul/driver/sql/alembic/versions/19d3a3ebfe1d_change_patchset_to_string.py
new file mode 100644
index 0000000..505a1ed
--- /dev/null
+++ b/zuul/driver/sql/alembic/versions/19d3a3ebfe1d_change_patchset_to_string.py
@@ -0,0 +1,29 @@
+"""Change patchset to string
+
+Revision ID: 19d3a3ebfe1d
+Revises: cfc0dc45f341
+Create Date: 2018-01-10 07:42:16.546751
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = '19d3a3ebfe1d'
+down_revision = 'cfc0dc45f341'
+branch_labels = None
+depends_on = None
+
+from alembic import op
+import sqlalchemy as sa
+
+BUILDSET_TABLE = 'zuul_buildset'
+
+
+def upgrade(table_prefix=''):
+    op.alter_column(table_prefix + BUILDSET_TABLE,
+                    'patchset',
+                    type_=sa.String(255),
+                    existing_nullable=True)
+
+
+def downgrade():
+    raise Exception("Downgrades not supported")
diff --git a/zuul/driver/sql/alembic/versions/cfc0dc45f341_change_patchset_to_string.py b/zuul/driver/sql/alembic/versions/cfc0dc45f341_change_patchset_to_string.py
new file mode 100644
index 0000000..3fde8e5
--- /dev/null
+++ b/zuul/driver/sql/alembic/versions/cfc0dc45f341_change_patchset_to_string.py
@@ -0,0 +1,30 @@
+"""Change patchset to string
+
+Revision ID: cfc0dc45f341
+Revises: ba4cdce9b18c
+Create Date: 2018-01-09 16:44:31.506958
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = 'cfc0dc45f341'
+down_revision = 'ba4cdce9b18c'
+branch_labels = None
+depends_on = None
+
+from alembic import op
+import sqlalchemy as sa
+
+BUILDSET_TABLE = 'zuul_buildset'
+
+
+def upgrade(table_prefix=''):
+    op.alter_column(table_prefix + BUILDSET_TABLE,
+                    'patchset',
+                    sa.String(255),
+                    existing_nullable=True,
+                    existing_type=sa.Integer)
+
+
+def downgrade():
+    raise Exception("Downgrades not supported")
diff --git a/zuul/driver/sql/sqlconnection.py b/zuul/driver/sql/sqlconnection.py
index 285d0c2..715d72b 100644
--- a/zuul/driver/sql/sqlconnection.py
+++ b/zuul/driver/sql/sqlconnection.py
@@ -92,7 +92,7 @@
             sa.Column('pipeline', sa.String(255)),
             sa.Column('project', sa.String(255)),
             sa.Column('change', sa.Integer, nullable=True),
-            sa.Column('patchset', sa.Integer, nullable=True),
+            sa.Column('patchset', sa.String(255), nullable=True),
             sa.Column('ref', sa.String(255)),
             sa.Column('oldrev', sa.String(255)),
             sa.Column('newrev', sa.String(255)),
diff --git a/zuul/driver/zuul/__init__.py b/zuul/driver/zuul/__init__.py
index 0f6ec7d..e381137 100644
--- a/zuul/driver/zuul/__init__.py
+++ b/zuul/driver/zuul/__init__.py
@@ -90,7 +90,18 @@
         if not hasattr(change, 'needed_by_changes'):
             self.log.debug("  %s does not support dependencies" % type(change))
             return
-        for needs in change.needed_by_changes:
+
+        # This is very inefficient, especially on systems with large
+        # numbers of github installations.  This can be improved later
+        # with persistent storage of dependency information.
+        needed_by_changes = set(change.needed_by_changes)
+        for source in self.sched.connections.getSources():
+            self.log.debug("  Checking source: %s", source)
+            needed_by_changes.update(
+                source.getChangesDependingOn(change, None))
+        self.log.debug("  Following changes: %s", needed_by_changes)
+
+        for needs in needed_by_changes:
             self._createParentChangeEnqueuedEvent(needs, pipeline)
 
     def _createParentChangeEnqueuedEvent(self, change, pipeline):
diff --git a/zuul/executor/client.py b/zuul/executor/client.py
index 06c2087..b21a290 100644
--- a/zuul/executor/client.py
+++ b/zuul/executor/client.py
@@ -245,7 +245,7 @@
         for change in dependent_changes:
             # We have to find the project this way because it may not
             # be registered in the tenant (ie, a foreign project).
-            source = self.sched.connections.getSourceByHostname(
+            source = self.sched.connections.getSourceByCanonicalHostname(
                 change['project']['canonical_hostname'])
             project = source.getProject(change['project']['name'])
             if project not in projects:
diff --git a/zuul/executor/server.py b/zuul/executor/server.py
index 7a93f89..a8ab8c4 100644
--- a/zuul/executor/server.py
+++ b/zuul/executor/server.py
@@ -44,7 +44,8 @@
 BUFFER_LINES_FOR_SYNTAX = 200
 COMMANDS = ['stop', 'pause', 'unpause', 'graceful', 'verbose',
             'unverbose', 'keep', 'nokeep']
-DEFAULT_FINGER_PORT = 79
+DEFAULT_FINGER_PORT = 7900
+BLACKLISTED_ANSIBLE_CONNECTION_TYPES = ['network_cli']
 
 
 class StopException(Exception):
@@ -347,6 +348,8 @@
             pass
         self.known_hosts = os.path.join(ssh_dir, 'known_hosts')
         self.inventory = os.path.join(self.ansible_root, 'inventory.yaml')
+        self.setup_inventory = os.path.join(self.ansible_root,
+                                            'setup-inventory.yaml')
         self.logging_json = os.path.join(self.ansible_root, 'logging.json')
         self.playbooks = []  # The list of candidate playbooks
         self.playbook = None  # A pointer to the candidate we have chosen
@@ -493,6 +496,26 @@
                 shutil.copy(os.path.join(library_path, fn), target_dir)
 
 
+def make_setup_inventory_dict(nodes):
+
+    hosts = {}
+    for node in nodes:
+        if (node['host_vars'].get('ansible_connection') in
+            BLACKLISTED_ANSIBLE_CONNECTION_TYPES):
+            continue
+
+        for name in node['name']:
+            hosts[name] = node['host_vars']
+
+    inventory = {
+        'all': {
+            'hosts': hosts,
+        }
+    }
+
+    return inventory
+
+
 def make_inventory_dict(nodes, groups, all_vars):
 
     hosts = {}
@@ -931,6 +954,10 @@
             if username:
                 host_vars['ansible_user'] = username
 
+            connection_type = node.get('connection_type')
+            if connection_type:
+                host_vars['ansible_connection'] = connection_type
+
             host_keys = []
             for key in node.get('host_keys'):
                 if port != 22:
@@ -1153,8 +1180,13 @@
             result_data_file=self.jobdir.result_data_file)
 
         nodes = self.getHostList(args)
+        setup_inventory = make_setup_inventory_dict(nodes)
         inventory = make_inventory_dict(nodes, args['groups'], all_vars)
 
+        with open(self.jobdir.setup_inventory, 'w') as setup_inventory_yaml:
+            setup_inventory_yaml.write(
+                yaml.safe_dump(setup_inventory, default_flow_style=False))
+
         with open(self.jobdir.inventory, 'w') as inventory_yaml:
             inventory_yaml.write(
                 yaml.safe_dump(inventory, default_flow_style=False))
@@ -1419,6 +1451,7 @@
             verbose = '-v'
 
         cmd = ['ansible', '*', verbose, '-m', 'setup',
+               '-i', self.jobdir.setup_inventory,
                '-a', 'gather_subset=!all']
 
         result, code = self.runAnsible(
@@ -1706,6 +1739,7 @@
         self.merger_worker.registerFunction("merger:merge")
         self.merger_worker.registerFunction("merger:cat")
         self.merger_worker.registerFunction("merger:refstate")
+        self.merger_worker.registerFunction("merger:fileschanges")
 
     def register_work(self):
         if self._running:
@@ -1859,6 +1893,9 @@
             elif job.name == 'merger:refstate':
                 self.log.debug("Got refstate job: %s" % job.unique)
                 self.refstate(job)
+            elif job.name == 'merger:fileschanges':
+                self.log.debug("Got fileschanges job: %s" % job.unique)
+                self.fileschanges(job)
             else:
                 self.log.error("Unable to handle job %s" % job.name)
                 job.sendWorkFail()
@@ -1970,6 +2007,19 @@
                       files=files)
         job.sendWorkComplete(json.dumps(result))
 
+    def fileschanges(self, job):
+        args = json.loads(job.arguments)
+        task = self.update(args['connection'], args['project'])
+        task.wait()
+        with self.merger_lock:
+            files = self.merger.getFilesChanges(
+                args['connection'], args['project'],
+                args['branch'],
+                args['tosha'])
+        result = dict(updated=True,
+                      files=files)
+        job.sendWorkComplete(json.dumps(result))
+
     def refstate(self, job):
         args = json.loads(job.arguments)
         with self.merger_lock:
diff --git a/zuul/lib/connections.py b/zuul/lib/connections.py
index 262490a..33c66f9 100644
--- a/zuul/lib/connections.py
+++ b/zuul/lib/connections.py
@@ -14,6 +14,7 @@
 
 import logging
 import re
+from collections import OrderedDict
 
 import zuul.driver.zuul
 import zuul.driver.gerrit
@@ -38,7 +39,7 @@
     log = logging.getLogger("zuul.ConnectionRegistry")
 
     def __init__(self):
-        self.connections = {}
+        self.connections = OrderedDict()
         self.drivers = {}
 
         self.registerDriver(zuul.driver.zuul.ZuulDriver())
@@ -85,7 +86,7 @@
 
     def configure(self, config, source_only=False):
         # Register connections from the config
-        connections = {}
+        connections = OrderedDict()
 
         for section_name in config.sections():
             con_match = re.match(r'^connection ([\'\"]?)(.*)(\1)$',
@@ -154,6 +155,13 @@
         connection = self.connections[connection_name]
         return connection.driver.getSource(connection)
 
+    def getSources(self):
+        sources = []
+        for connection in self.connections.values():
+            if hasattr(connection.driver, 'getSource'):
+                sources.append(connection.driver.getSource(connection))
+        return sources
+
     def getReporter(self, connection_name, config=None):
         connection = self.connections[connection_name]
         return connection.driver.getReporter(connection, config)
@@ -162,7 +170,7 @@
         connection = self.connections[connection_name]
         return connection.driver.getTrigger(connection, config)
 
-    def getSourceByHostname(self, canonical_hostname):
+    def getSourceByCanonicalHostname(self, canonical_hostname):
         for connection in self.connections.values():
             if hasattr(connection, 'canonical_hostname'):
                 if connection.canonical_hostname == canonical_hostname:
diff --git a/zuul/lib/dependson.py b/zuul/lib/dependson.py
new file mode 100644
index 0000000..cd0f6ef
--- /dev/null
+++ b/zuul/lib/dependson.py
@@ -0,0 +1,29 @@
+# Copyright 2018 Red Hat, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may
+# not use this file except in compliance with the License. You may obtain
+# a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+# License for the specific language governing permissions and limitations
+# under the License.
+
+import re
+
+
+DEPENDS_ON_RE = re.compile(r"^Depends-On: (.*?)\s*$",
+                           re.MULTILINE | re.IGNORECASE)
+
+
+def find_dependency_headers(message):
+    # Search for Depends-On headers
+    dependencies = []
+    for match in DEPENDS_ON_RE.findall(message):
+        if match in dependencies:
+            continue
+        dependencies.append(match)
+    return dependencies
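
(Illustrative aside, not part of the patch.) Usage of the helper above on a made-up commit message, assuming zuul is importable:

from zuul.lib.dependson import find_dependency_headers

message = """Add feature X

Depends-On: https://review.example.com/123456
Depends-On: https://github.com/acme/widgets/pull/42
"""

print(find_dependency_headers(message))
# ['https://review.example.com/123456',
#  'https://github.com/acme/widgets/pull/42']
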
diff --git a/zuul/lib/fingergw.py b/zuul/lib/fingergw.py
index c89ed0f..b56fe04 100644
--- a/zuul/lib/fingergw.py
+++ b/zuul/lib/fingergw.py
@@ -66,11 +66,19 @@
         try:
             build_uuid = self.getCommand()
             port_location = self.rpc.get_job_log_stream_address(build_uuid)
+
+            if not port_location:
+                msg = 'Invalid build UUID %s' % build_uuid
+                self.request.sendall(msg.encode('utf-8'))
+                return
+
             self._fingerClient(
                 port_location['server'],
                 port_location['port'],
                 build_uuid,
             )
+        except BrokenPipeError:   # Client disconnect
+            return
         except Exception:
             self.log.exception('Finger request handling exception:')
             msg = 'Internal streaming error'
diff --git a/zuul/lib/log_streamer.py b/zuul/lib/log_streamer.py
index 5c894b4..f96f442 100644
--- a/zuul/lib/log_streamer.py
+++ b/zuul/lib/log_streamer.py
@@ -56,8 +56,6 @@
             self.request.sendall(msg.encode("utf-8"))
             return
 
-        build_uuid = build_uuid.rstrip()
-
         # validate build ID
         if not re.match("[0-9A-Fa-f]+$", build_uuid):
             msg = 'Build ID %s is not valid' % build_uuid
@@ -159,12 +157,11 @@
     Class implementing log streaming over the finger daemon port.
     '''
 
-    def __init__(self, user, host, port, jobdir_root):
+    def __init__(self, host, port, jobdir_root):
         self.log = logging.getLogger('zuul.log_streamer')
         self.log.debug("LogStreamer starting on port %s", port)
         self.server = LogStreamerServer((host, port),
                                         RequestHandler,
-                                        user=user,
                                         jobdir_root=jobdir_root)
 
         # We start the actual serving within a thread so we can return to
diff --git a/zuul/lib/streamer_utils.py b/zuul/lib/streamer_utils.py
index 985f3c3..3d2d561 100644
--- a/zuul/lib/streamer_utils.py
+++ b/zuul/lib/streamer_utils.py
@@ -60,7 +60,8 @@
                 ret = buffer.decode('utf-8')
                 x = ret.find('\n')
                 if x > 0:
-                    return ret[:x]
+                    # rstrip to remove any other unnecessary chars (e.g. \r)
+                    return ret[:x].rstrip()
             except UnicodeDecodeError:
                 pass
 
@@ -73,7 +74,7 @@
     address_family = socket.AF_INET6
 
     def __init__(self, *args, **kwargs):
-        self.user = kwargs.pop('user')
+        self.user = kwargs.pop('user', None)
         self.pid_file = kwargs.pop('pid_file', None)
         socketserver.ThreadingTCPServer.__init__(self, *args, **kwargs)
 
diff --git a/zuul/manager/__init__.py b/zuul/manager/__init__.py
index d205afc..b8a280f 100644
--- a/zuul/manager/__init__.py
+++ b/zuul/manager/__init__.py
@@ -12,9 +12,11 @@
 
 import logging
 import textwrap
+import urllib
 
 from zuul import exceptions
 from zuul import model
+from zuul.lib.dependson import find_dependency_headers
 
 
 class DynamicChangeQueueContextManager(object):
@@ -343,6 +345,32 @@
         self.dequeueItem(item)
         self.reportStats(item)
 
+    def updateCommitDependencies(self, change, change_queue):
+        # Search for Depends-On headers and find appropriate changes
+        self.log.debug("  Updating commit dependencies for %s", change)
+        change.refresh_deps = False
+        dependencies = []
+        seen = set()
+        for match in find_dependency_headers(change.message):
+            self.log.debug("  Found Depends-On header: %s", match)
+            if match in seen:
+                continue
+            seen.add(match)
+            try:
+                url = urllib.parse.urlparse(match)
+            except ValueError:
+                continue
+            source = self.sched.connections.getSourceByCanonicalHostname(
+                url.hostname)
+            if not source:
+                continue
+            self.log.debug("  Found source: %s", source)
+            dep = source.getChangeByURL(match)
+            if dep and (not dep.is_merged) and dep not in dependencies:
+                self.log.debug("  Adding dependency: %s", dep)
+                dependencies.append(dep)
+        change.commit_needs_changes = dependencies
+
     def provisionNodes(self, item):
         jobs = item.findJobsToRequest()
         if not jobs:
diff --git a/zuul/manager/dependent.py b/zuul/manager/dependent.py
index 5aef453..5ad7611 100644
--- a/zuul/manager/dependent.py
+++ b/zuul/manager/dependent.py
@@ -95,12 +95,29 @@
     def enqueueChangesBehind(self, change, quiet, ignore_requirements,
                              change_queue):
         self.log.debug("Checking for changes needing %s:" % change)
-        to_enqueue = []
-        source = change.project.source
         if not hasattr(change, 'needed_by_changes'):
             self.log.debug("  %s does not support dependencies" % type(change))
             return
-        for other_change in change.needed_by_changes:
+
+        # For each project in the change queue, ask its source for changes
+        # depending on this change, then dedup.
+        sources = set()
+        for project in change_queue.projects:
+            sources.add(project.source)
+
+        seen = set(change.needed_by_changes)
+        needed_by_changes = change.needed_by_changes[:]
+        for source in sources:
+            self.log.debug("  Checking source: %s", source)
+            for c in source.getChangesDependingOn(change,
+                                                  change_queue.projects):
+                if c not in seen:
+                    seen.add(c)
+                    needed_by_changes.append(c)
+
+        self.log.debug("  Following changes: %s", needed_by_changes)
+
+        to_enqueue = []
+        for other_change in needed_by_changes:
             with self.getChangeQueue(other_change) as other_change_queue:
                 if other_change_queue != change_queue:
                     self.log.debug("  Change %s in project %s can not be "
@@ -108,6 +125,7 @@
                                    (other_change, other_change.project,
                                     change_queue))
                     continue
+            source = other_change.project.source
             if source.canMerge(other_change, self.getSubmitAllowNeeds()):
                 self.log.debug("  Change %s needs %s and is ready to merge" %
                                (other_change, change))
@@ -123,13 +141,16 @@
 
     def enqueueChangesAhead(self, change, quiet, ignore_requirements,
                             change_queue, history=None):
-        if history and change.number in history:
+        if history and change in history:
             # detected dependency cycle
             self.log.warn("Dependency cycle detected")
             return False
         if hasattr(change, 'number'):
             history = history or []
-            history.append(change.number)
+            history = history + [change]
+        else:
+            # Don't enqueue dependencies ahead of a non-change ref.
+            return True
 
         ret = self.checkForChangesNeededBy(change, change_queue)
         if ret in [True, False]:
@@ -145,10 +166,12 @@
         return True
 
     def checkForChangesNeededBy(self, change, change_queue):
-        self.log.debug("Checking for changes needed by %s:" % change)
-        source = change.project.source
         # Return true if okay to proceed enqueing this change,
         # false if the change should not be enqueued.
+        self.log.debug("Checking for changes needed by %s:" % change)
+        if (hasattr(change, 'commit_needs_changes') and
+            (change.refresh_deps or change.commit_needs_changes is None)):
+            self.updateCommitDependencies(change, change_queue)
         if not hasattr(change, 'needs_changes'):
             self.log.debug("  %s does not support dependencies" % type(change))
             return True
@@ -180,7 +203,8 @@
                     self.log.debug("  Needed change is already ahead "
                                    "in the queue")
                     continue
-                if source.canMerge(needed_change, self.getSubmitAllowNeeds()):
+                if needed_change.project.source.canMerge(
+                        needed_change, self.getSubmitAllowNeeds()):
                     self.log.debug("  Change %s is needed" % needed_change)
                     if needed_change not in changes_needed:
                         changes_needed.append(needed_change)
diff --git a/zuul/manager/independent.py b/zuul/manager/independent.py
index 65f5ca0..9da40d5 100644
--- a/zuul/manager/independent.py
+++ b/zuul/manager/independent.py
@@ -34,13 +34,13 @@
 
     def enqueueChangesAhead(self, change, quiet, ignore_requirements,
                             change_queue, history=None):
-        if history and change.number in history:
+        if history and change in history:
             # detected dependency cycle
             self.log.warn("Dependency cycle detected")
             return False
         if hasattr(change, 'number'):
             history = history or []
-            history.append(change.number)
+            history = history + [change]
         else:
             # Don't enqueue dependencies ahead of a non-change ref.
             return True
@@ -70,6 +70,9 @@
         self.log.debug("Checking for changes needed by %s:" % change)
         # Return true if okay to proceed enqueing this change,
         # false if the change should not be enqueued.
+        if (hasattr(change, 'commit_needs_changes') and
+            (change.refresh_deps or change.commit_needs_changes is None)):
+            self.updateCommitDependencies(change, None)
         if not hasattr(change, 'needs_changes'):
             self.log.debug("  %s does not support dependencies" % type(change))
             return True
diff --git a/zuul/merger/client.py b/zuul/merger/client.py
index 2614e58..c89a6fb 100644
--- a/zuul/merger/client.py
+++ b/zuul/merger/client.py
@@ -131,6 +131,15 @@
         job = self.submitJob('merger:cat', data, None, precedence)
         return job
 
+    def getFilesChanges(self, connection_name, project_name, branch,
+                        tosha=None, precedence=zuul.model.PRECEDENCE_HIGH):
+        data = dict(connection=connection_name,
+                    project=project_name,
+                    branch=branch,
+                    tosha=tosha)
+        job = self.submitJob('merger:fileschanges', data, None, precedence)
+        return job
+
     def onBuildCompleted(self, job):
         data = getJobData(job)
         merged = data.get('merged', False)
diff --git a/zuul/merger/merger.py b/zuul/merger/merger.py
index 06ec4b2..bd4ca58 100644
--- a/zuul/merger/merger.py
+++ b/zuul/merger/merger.py
@@ -314,6 +314,18 @@
                             'utf-8')
         return ret
 
+    def getFilesChanges(self, branch, tosha=None):
+        repo = self.createRepoObject()
+        files = set()
+        head = repo.heads[branch].commit
+        files.update(set(head.stats.files.keys()))
+        if tosha:
+            for cmt in head.iter_parents():
+                if cmt.hexsha == tosha:
+                    break
+                files.update(set(cmt.stats.files.keys()))
+        return list(files)
+
     def deleteRemote(self, remote):
         repo = self.createRepoObject()
         repo.delete_remote(repo.remotes[remote])
@@ -581,3 +593,8 @@
     def getFiles(self, connection_name, project_name, branch, files, dirs=[]):
         repo = self.getRepo(connection_name, project_name)
         return repo.getFiles(files, dirs, branch=branch)
+
+    def getFilesChanges(self, connection_name, project_name, branch,
+                        tosha=None):
+        repo = self.getRepo(connection_name, project_name)
+        return repo.getFilesChanges(branch, tosha)
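
(Illustrative aside, not part of the patch.) The file-collection walk used by getFilesChanges() in isolation, assuming GitPython and a placeholder repository path and sha:

import git


def files_changed(repo_path, branch, tosha=None):
    # Union of files touched by the branch head and, if tosha is given,
    # by every ancestor commit until tosha is reached.
    repo = git.Repo(repo_path)
    head = repo.heads[branch].commit
    files = set(head.stats.files.keys())
    if tosha:
        for commit in head.iter_parents():
            if commit.hexsha == tosha:
                break
            files.update(commit.stats.files.keys())
    return sorted(files)

# Example (path and sha are placeholders):
# print(files_changed('/var/lib/zuul/git/project', 'master',
#                     tosha='0123456789abcdef0123456789abcdef01234567'))
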
diff --git a/zuul/merger/server.py b/zuul/merger/server.py
index 576d41e..aa04fc2 100644
--- a/zuul/merger/server.py
+++ b/zuul/merger/server.py
@@ -81,6 +81,7 @@
         self.worker.registerFunction("merger:merge")
         self.worker.registerFunction("merger:cat")
         self.worker.registerFunction("merger:refstate")
+        self.worker.registerFunction("merger:fileschanges")
 
     def stop(self):
         self.log.debug("Stopping")
@@ -117,6 +118,9 @@
                     elif job.name == 'merger:refstate':
                         self.log.debug("Got refstate job: %s" % job.unique)
                         self.refstate(job)
+                    elif job.name == 'merger:fileschanges':
+                        self.log.debug("Got fileschanges job: %s" % job.unique)
+                        self.fileschanges(job)
                     else:
                         self.log.error("Unable to handle job %s" % job.name)
                         job.sendWorkFail()
@@ -158,3 +162,12 @@
         result = dict(updated=True,
                       files=files)
         job.sendWorkComplete(json.dumps(result))
+
+    def fileschanges(self, job):
+        args = json.loads(job.arguments)
+        self.merger.updateRepo(args['connection'], args['project'])
+        files = self.merger.getFilesChanges(
+            args['connection'], args['project'], args['branch'], args['tosha'])
+        result = dict(updated=True,
+                      files=files)
+        job.sendWorkComplete(json.dumps(result))
diff --git a/zuul/model.py b/zuul/model.py
index 77770b7..29c5a9d 100644
--- a/zuul/model.py
+++ b/zuul/model.py
@@ -384,6 +384,7 @@
         self.private_ipv4 = None
         self.public_ipv6 = None
         self.connection_port = 22
+        self.connection_type = None
         self._keys = []
         self.az = None
         self.provider = None
@@ -844,6 +845,7 @@
             semaphore=None,
             attempts=3,
             final=False,
+            protected=None,
             roles=(),
             required_projects={},
             allowed_projects=None,
@@ -861,6 +863,7 @@
             inheritance_path=(),
             parent_data=None,
             description=None,
+            protected_origin=None,
         )
 
         self.inheritable_attributes = {}
@@ -1038,12 +1041,21 @@
 
         for k in self.execution_attributes:
             if (other._get(k) is not None and
-                k not in set(['final'])):
+                    k not in set(['final', 'protected'])):
                 if self.final:
                     raise Exception("Unable to modify final job %s attribute "
                                     "%s=%s with variant %s" % (
                                         repr(self), k, other._get(k),
                                         repr(other)))
+                if self.protected_origin:
+                    # This is a protected job; check the definition's origin
+                    this_origin = self.protected_origin
+                    other_origin = other.source_context.project.canonical_name
+                    if this_origin != other_origin:
+                        raise Exception("Job %s which is defined in %s is "
+                                        "protected and cannot be inherited "
+                                        "from other projects."
+                                        % (repr(self), this_origin))
                 if k not in set(['pre_run', 'run', 'post_run', 'roles',
                                  'variables', 'required_projects']):
                     # TODO(jeblair): determine if deepcopy is required
@@ -1054,6 +1066,17 @@
         if other.final != self.attributes['final']:
             self.final = other.final
 
+        # The protected flag may only be set to true
+        if other.protected is not None:
+            # do not allow the protected flag to be reset
+            if not other.protected and self.protected_origin:
+                raise Exception("Unable to reset protected attribute of job"
+                                " %s by job %s" % (
+                                    repr(self), repr(other)))
+            if not self.protected_origin:
+                self.protected_origin = \
+                    other.source_context.project.canonical_name
+
         # We must update roles before any playbook contexts
         if other._get('roles') is not None:
             self.addRoles(other.roles)
@@ -1385,6 +1408,8 @@
         build.build_set = self
 
     def removeBuild(self, build):
+        if build.job.name not in self.builds:
+            return
         self.tries[build.job.name] += 1
         del self.builds[build.job.name]
 
@@ -2100,11 +2125,28 @@
     def __init__(self, project):
         super(Change, self).__init__(project)
         self.number = None
+        # The gitweb url for browsing the change
         self.url = None
+        # URIs for this change which may appear in depends-on headers.
+        # Note these omit the scheme; i.e., they are hostname/path.
+        self.uris = []
         self.patchset = None
 
-        self.needs_changes = []
-        self.needed_by_changes = []
+        # Changes that the source determined are needed due to the
+        # git DAG:
+        self.git_needs_changes = []
+        self.git_needed_by_changes = []
+
+        # Changes that the source determined are needed by backwards
+        # compatible processing of Depends-On headers (Gerrit only):
+        self.compat_needs_changes = []
+        self.compat_needed_by_changes = []
+
+        # Changes that the pipeline manager determined are needed due
+        # to Depends-On headers (all drivers):
+        self.commit_needs_changes = None
+        self.refresh_deps = False
+
         self.is_current_patchset = True
         self.can_merge = False
         self.is_merged = False
@@ -2113,6 +2155,11 @@
         self.status = None
         self.owner = None
 
+        # This may be the commit message, or it may be a cover message
+        # in the case of a PR.  Either way, it's the place where we
+        # look for depends-on headers.
+        self.message = None
+
         self.source_event = None
 
     def _id(self):
@@ -2126,8 +2173,18 @@
             return True
         return False
 
+    @property
+    def needs_changes(self):
+        return (self.git_needs_changes + self.compat_needs_changes +
+                self.commit_needs_changes)
+
+    @property
+    def needed_by_changes(self):
+        return (self.git_needed_by_changes + self.compat_needed_by_changes)
+
     def isUpdateOf(self, other):
-        if ((hasattr(other, 'number') and self.number == other.number) and
+        if (self.project == other.project and
+            (hasattr(other, 'number') and self.number == other.number) and
             (hasattr(other, 'patchset') and
              self.patchset is not None and
              other.patchset is not None and
@@ -2255,7 +2312,7 @@
 
 
 class ProjectConfig(object):
-    # Represents a project cofiguration
+    # Represents a project configuration
     def __init__(self, name, source_context=None):
         self.name = name
         # If this is a template, it will have a source_context, but
@@ -2400,7 +2457,7 @@
         r.semaphores = copy.deepcopy(self.semaphores)
         return r
 
-    def extend(self, conf, tenant=None):
+    def extend(self, conf, tenant):
         if isinstance(conf, UnparsedTenantConfig):
             self.pragmas.extend(conf.pragmas)
             self.pipelines.extend(conf.pipelines)
@@ -2408,16 +2465,14 @@
             self.project_templates.extend(conf.project_templates)
             for k, v in conf.projects.items():
                 name = k
-                # If we have the tenant add the projects to
-                # the according canonical name instead of the given project
-                # name. If it is not found, it's ok to add this to the given
-                # name. We also don't need to throw the
+                # Add the projects under the corresponding canonical name
+                # instead of the given name. If it is not found, it's ok to
+                # add this to the given name. We also don't need to throw the
                 # ProjectNotFoundException here as semantic validation occurs
                 # later where it will fail then.
-                if tenant is not None:
-                    trusted, project = tenant.getProject(k)
-                    if project is not None:
-                        name = project.canonical_name
+                trusted, project = tenant.getProject(k)
+                if project is not None:
+                    name = project.canonical_name
                 self.projects.setdefault(name, []).extend(v)
             self.nodesets.extend(conf.nodesets)
             self.secrets.extend(conf.secrets)
@@ -2434,7 +2489,12 @@
                 raise ConfigItemMultipleKeysError()
             key, value = list(item.items())[0]
             if key == 'project':
-                name = value['name']
+                name = value.get('name')
+                if not name:
+                    # There is no name defined, so implicitly use the name
+                    # of the project where it is defined.
+                    name = value['_source_context'].project.canonical_name
+                    value['name'] = name
                 self.projects.setdefault(name, []).append(value)
             elif key == 'job':
                 self.jobs.append(value)
@@ -2643,11 +2703,11 @@
                                    repr(variant), change)
                     item.debug("Pipeline variant {variant} matched".format(
                         variant=repr(variant)), indent=2)
-            else:
-                self.log.debug("Pipeline variant %s did not match %s",
-                               repr(variant), change)
-                item.debug("Pipeline variant {variant} did not match".format(
-                    variant=repr(variant)), indent=2)
+                else:
+                    self.log.debug("Pipeline variant %s did not match %s",
+                                   repr(variant), change)
+                    item.debug("Pipeline variant {variant} did not match".
+                               format(variant=repr(variant)), indent=2)
             if not matched:
                 # A change must match at least one project pipeline
                 # job variant.
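
The Change class now tracks dependencies from three distinct discovery mechanisms and merges them through read-only properties, so each list can be refreshed independently. A self-contained toy illustration of that pattern (not Zuul code):

# Toy illustration of the split-lists-plus-property pattern used by
# the Change class above.
class FakeChange(object):
    def __init__(self):
        self.git_needs_changes = []      # from the git DAG
        self.compat_needs_changes = []   # legacy Depends-On (Gerrit)
        self.commit_needs_changes = []   # Depends-On via pipeline manager

    @property
    def needs_changes(self):
        return (self.git_needs_changes + self.compat_needs_changes +
                self.commit_needs_changes)

c = FakeChange()
c.git_needs_changes.append('I123')
c.commit_needs_changes.append('I456')
assert c.needs_changes == ['I123', 'I456']
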
diff --git a/zuul/reporter/__init__.py b/zuul/reporter/__init__.py
index ecf8855..1bff5cb 100644
--- a/zuul/reporter/__init__.py
+++ b/zuul/reporter/__init__.py
@@ -109,12 +109,10 @@
         else:
             return self._formatItemReport(item)
 
-    def _formatItemReportJobs(self, item):
-        # Return the list of jobs portion of the report
-        ret = ''
-
+    def _getItemReportJobsFields(self, item):
+        # Extract the report elements from an item
         config = self.connection.sched.config
-
+        jobs_fields = []
         for job in item.getJobs():
             build = item.current_build_set.getBuild(job.name)
             (result, url) = item.formatJobResult(job)
@@ -147,6 +145,13 @@
             else:
                 error = ''
             name = job.name + ' '
-            ret += '- %s%s : %s%s%s%s\n' % (name, url, result, error,
-                                            elapsed, voting)
+            jobs_fields.append((name, url, result, error, elapsed, voting))
+        return jobs_fields
+
+    def _formatItemReportJobs(self, item):
+        # Return the list of jobs portion of the report
+        ret = ''
+        jobs_fields = self._getItemReportJobsFields(item)
+        for job_fields in jobs_fields:
+            ret += '- %s%s : %s%s%s%s\n' % job_fields
         return ret
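
Splitting _getItemReportJobsFields() out of _formatItemReportJobs() lets driver-specific reporters reuse the extracted fields while rendering their own markup. A hedged sketch of such a subclass; MarkdownReporter is hypothetical, only the base-class helpers shown above are assumed, and the remaining abstract methods are omitted:

from zuul.reporter import BaseReporter

class MarkdownReporter(BaseReporter):
    def _formatItemReportJobs(self, item):
        # Reuse the shared field extraction, render as a Markdown list.
        ret = ''
        for (name, url, result, error, elapsed, voting) in \
                self._getItemReportJobsFields(item):
            ret += '* [%s](%s): %s%s%s%s\n' % (name.strip(), url, result,
                                               error, elapsed, voting)
        return ret
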
diff --git a/zuul/scheduler.py b/zuul/scheduler.py
index b978979..a2e3b6e 100644
--- a/zuul/scheduler.py
+++ b/zuul/scheduler.py
@@ -823,8 +823,7 @@
         if self.statsd:
             self.log.debug("Statsd enabled")
         else:
-            self.log.debug("Statsd disabled because python statsd "
-                           "package not found")
+            self.log.debug("Statsd not configured")
         while True:
             self.log.debug("Run handler sleeping")
             self.wake_event.wait()
@@ -1089,3 +1088,25 @@
         for pipeline in tenant.layout.pipelines.values():
             pipelines.append(pipeline.formatStatusJSON(websocket_url))
         return json.dumps(data)
+
+    def onChangeUpdated(self, change):
+        """Remove stale dependency references on change update.
+
+        When a change is updated with a new patchset, other changes in
+        the system may still have a reference to the old patchset in
+        their dependencies.  Search for those (across all sources) and
+        mark that their dependencies are out of date.  This will cause
+        them to be refreshed the next time the queue processor
+        examines them.
+        """
+
+        self.log.debug("Change %s has been updated, clearing dependent "
+                       "change caches", change)
+        for source in self.connections.getSources():
+            for other_change in source.getCachedChanges():
+                if other_change.commit_needs_changes is None:
+                    continue
+                for dep in other_change.commit_needs_changes:
+                    if change.isUpdateOf(dep):
+                        other_change.refresh_deps = True
+        change.refresh_deps = True
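
onChangeUpdated() walks every source's cached changes and flags refresh_deps on any change whose commit dependencies reference an older patchset of the updated change. A small stand-alone sketch of that invalidation check, using stand-in objects rather than Zuul classes:

# Stand-in objects illustrating the invalidation performed above.
class FakeDep(object):
    def __init__(self, number, patchset, project='org/project'):
        self.number = number
        self.patchset = patchset
        self.project = project

    def isUpdateOf(self, other):
        return (self.project == other.project and
                self.number == other.number and
                self.patchset > other.patchset)

updated = FakeDep(1000, patchset=2)

class FakeOther(object):
    commit_needs_changes = [FakeDep(1000, patchset=1)]
    refresh_deps = False

other = FakeOther()
for dep in other.commit_needs_changes:
    if updated.isUpdateOf(dep):
        other.refresh_deps = True
assert other.refresh_deps
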
diff --git a/zuul/source/__init__.py b/zuul/source/__init__.py
index 0396aff..00dfc9c 100644
--- a/zuul/source/__init__.py
+++ b/zuul/source/__init__.py
@@ -52,6 +52,29 @@
         """Get the change representing an event."""
 
     @abc.abstractmethod
+    def getChangeByURL(self, url):
+        """Get the change corresponding to the supplied URL.
+
+        The URL may or may not correspond to this source; if it doesn't,
+        or there is no change at that URL, return None.
+
+        """
+
+    @abc.abstractmethod
+    def getChangesDependingOn(self, change, projects):
+        """Return changes which depend on changes at the supplied URIs.
+
+        Search this source for changes which depend on the supplied
+        change.  Generally the Change.uris attribute should be used to
+        perform the search, as it contains a list of URLs (without the
+        scheme) which all represent a single change.
+
+        If the projects argument is None, search across all known
+        projects.  If it is supplied, the search may optionally be
+        restricted to only those projects.
+        """
+
+    @abc.abstractmethod
     def getProjectOpenChanges(self, project):
         """Get the open changes for a project."""
 
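
Every source driver must now implement the two new abstract methods. A hypothetical skeleton is sketched below; the class name, the connection.baseurl attribute, and the helper methods are illustrative only, and the rest of the abstract interface is omitted:

from zuul.source import BaseSource

class MySource(BaseSource):
    def getChangeByURL(self, url):
        # Return None if the URL does not belong to this source or no
        # change exists at it.
        if not url.startswith(self.connection.baseurl):  # assumed attribute
            return None
        return self._queryChangeByUrl(url)  # hypothetical helper

    def getChangesDependingOn(self, change, projects):
        # Search change.uris (scheme-less URLs) within the given
        # projects, or across all known projects when projects is None.
        return self._searchDependsOn(change.uris, projects)  # hypothetical
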
diff --git a/zuul/web/__init__.py b/zuul/web/__init__.py
index cefc922..a98a6c8 100755
--- a/zuul/web/__init__.py
+++ b/zuul/web/__init__.py
@@ -305,6 +305,7 @@
         self.listen_port = listen_port
         self.event_loop = None
         self.term = None
+        self.server = None
         self.static_cache_expiry = static_cache_expiry
         # instanciate handlers
         self.rpc = zuul.rpcclient.RPCClient(gear_server, gear_port,