
Mercury - Externalized Dependencies


In Mercury we tried to bring some contemporary ideas into Maven: to make Maven not a monolithic build system, but a Lego-like construction set that lets users build the systems they need. One set of building blocks in particular implements the idea that dependencies are a universal commodity that exists outside the Maven world, and that resolving conflicts among dependencies can apply to a much broader range of them than just the `<dependencies/>` tag in a POM file.

For example, it should be relatively easy to plug in an implementation that reads OSGi bundle dependencies, or one that reads dependencies from a .properties file.

[drools]: https://www.drools.org/ "Drools home page"

Or - my favorite - keep dependencies in a [Drools][drools]-backed DSL file. I will try to create this implementation after we integrate the Maven POM dependency reader into Mercury.
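To make the externalization idea concrete, here is a minimal sketch of a dependency source backed by a .properties file. The `DependencySource` interface and `PropertiesDependencySource` class are hypothetical stand-ins for illustration only; Mercury's actual DependencyProcessor works on Mercury metadata types, not plain strings.

```java
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

// Hypothetical stand-in for Mercury's DependencyProcessor abstraction:
// any object that can produce a list of dependency coordinates for an
// artifact, regardless of where that information is stored.
interface DependencySource
{
    List<String> getDependencies( String artifactKey ) throws IOException;
}

// Reads dependencies from a .properties file where each artifact key
// maps to a comma-separated list of "groupId:artifactId:version" GAVs.
class PropertiesDependencySource implements DependencySource
{
    private final Properties props = new Properties();

    PropertiesDependencySource( Reader source ) throws IOException
    {
        props.load( source );
    }

    public List<String> getDependencies( String artifactKey )
    {
        List<String> deps = new ArrayList<String>();
        String value = props.getProperty( artifactKey, "" );
        for ( String gav : value.split( "," ) )
            if ( gav.trim().length() > 0 )
                deps.add( gav.trim() );
        return deps;
    }
}
```

A tree builder handed such a source never needs to know whether the dependencies came from a POM, an OSGi manifest, or a flat file.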

How it's done:

* Dependency trees are built and manipulated by a **DependencyTreeBuilder** object
* A DependencyTreeBuilder instance is created from a collection of **Repositories** (used to find artifacts) and an instance of **DependencyProcessor**
* The DependencyProcessor abstraction is the externalization component that reads and processes dependencies, making the builder independent of where dependency information is actually stored

Example code

    // build a tree builder from a collection of repositories
    // and a dependency processor
    DependencyTreeBuilder dtb
        = new DependencyTreeBuilder( null, null, null, reps, processor );
    // build the full dependency tree for maven-core 2.0.9
    MetadataTreeNode root
        = dtb.buildTree( new ArtifactBasicMetadata( "org.apache.maven:maven-core:2.0.9" ) );
    // collapse the tree into a conflict-free compile classpath
    List cp
        = dtb.resolveConflicts( ArtifactScopeEnum.compile );

This example demonstrates the full dependency resolution cycle:

* the full tree is created and stored in the **root** variable
* the compile classpath is stored in the variable **cp**

Two initialization steps are required first:

* create a collection of repositories in **reps**
* obtain an instance of DependencyProcessor in **processor**
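The conflict resolution step above collapses duplicate versions of the same artifact so that only one lands on the classpath. The toy resolver below illustrates the idea with a simple "highest version wins" policy over a flattened list of coordinates; it is purely illustrative and is not Mercury's actual resolver, which works on the metadata tree and honors scopes.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Toy conflict resolver: keeps one version per groupId:artifactId.
// Illustrative only - not Mercury's actual conflict resolution logic.
class ToyConflictResolver
{
    static List<String> resolve( List<String> gavs )
    {
        Map<String, String> winners = new LinkedHashMap<String, String>();
        for ( String gav : gavs )
        {
            String[] parts = gav.split( ":" );
            String ga = parts[0] + ":" + parts[1];
            String version = parts[2];
            String current = winners.get( ga );
            // naive lexicographic comparison; real resolvers parse
            // version components ("2.0.10" vs "2.0.9" would break this)
            if ( current == null || version.compareTo( current ) > 0 )
                winners.put( ga, version );
        }
        List<String> classpath = new ArrayList<String>();
        for ( Map.Entry<String, String> e : winners.entrySet() )
            classpath.add( e.getKey() + ":" + e.getValue() );
        return classpath;
    }
}
```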

Also interesting: the resulting classpath is a list of artifact metadata objects, not artifact binaries. So in order to actually bring the binaries to the local machine, we need one more call - to VirtualRepositoryReader, which I will describe in a separate posting.

What's important here is that fetching binaries from remote repositories is separated from building the dependency tree as well as from resolving conflicts. This means you can call each of these operations independently to construct the workflow you need for a particular application. All the metadata fetched while building the dependency tree is cached by a **RepositoryMetadataCache** component that complies with the repository update policy.
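The caching behavior can be pictured as a map from artifact coordinates to metadata, guarded by a staleness check that mimics a repository update policy such as "daily". The class below is an illustrative sketch under that assumption, not Mercury's actual RepositoryMetadataCache.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative metadata cache: entries expire after a time-to-live,
// mimicking a repository update policy. This is a sketch, not
// Mercury's actual RepositoryMetadataCache.
class ToyMetadataCache
{
    private static class Entry
    {
        final String metadata;
        final long fetchedAt;

        Entry( String metadata, long fetchedAt )
        {
            this.metadata = metadata;
            this.fetchedAt = fetchedAt;
        }
    }

    private final Map<String, Entry> entries = new HashMap<String, Entry>();
    private final long ttlMillis;

    ToyMetadataCache( long ttlMillis )
    {
        this.ttlMillis = ttlMillis;
    }

    // Returns cached metadata, or null when absent or stale so the
    // caller knows to re-fetch from the remote repository.
    String get( String gav, long now )
    {
        Entry e = entries.get( gav );
        if ( e == null || now - e.fetchedAt > ttlMillis )
            return null;
        return e.metadata;
    }

    void put( String gav, String metadata, long now )
    {
        entries.put( gav, new Entry( metadata, now ) );
    }
}
```

A tree builder backed by such a cache only hits the network for coordinates it has never seen or whose entries have gone stale under the policy.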

Written by Oleg Gusakov

Oleg is a former Chief Architect at Sonatype. He now works as an engineer at TripActions, a business travel platform that empowers companies and travelers to show up and create growth.