News and Notes from the Makers of Nexus | Sonatype Blog

Sonatype Dependency Management MCP Server Now Live in OSS MCP Registry

Written by Crystal Derakhshan | October 21, 2025

AI-Assisted Coding Tools Are Still Maturing

The last 18 months have seen explosive adoption of AI copilots and coding agents. They've gone from experimental novelties to trusted accelerators, with millions of developers now weaving them into their daily workflows.

At their core, these tools are powered by large language models (LLMs) — the same engines behind chatbots and generative AI. While LLMs handle code generation effectively, they struggle with a critical task.

LLMs are excellent code autocomplete engines, but they are mediocre at choosing dependencies. Rather than prioritizing recent, secure, high-quality versions, these engines tend to suggest whatever version was latest when they were trained, or whichever version appears most often in sample code. LLMs also have a tendency to hallucinate package names, which creates typosquatting risk. Tell Copilot to build a weather app, and you might get a library that's outdated, has security issues, or has a problematic license. But what if the AI could check with a trusted dependency expert before writing that code?

That's where Sonatype Dependency Management MCP server comes in, and it's now available in the OSS MCP Registry.

What Is an MCP Server?

Model Context Protocol (MCP) is a new open standard that allows AI models and agents to securely interact with external tools. Think of it as an integration layer designed for AI, not humans.

Each MCP server exposes a set of capabilities that AI can call on and use. It gives the LLM "context" to take action and produce better outputs. Published by Anthropic (the makers of Claude), MCP was introduced in late 2024 as an open protocol, not a proprietary API, so that any AI assistant (Claude, Copilot, Cursor, etc.) can use it.

Unlike other APIs, MCP servers are built for LLMs, with structured schemas, discovery layers, and context-passing designed so that AI systems can "understand" what a tool can do.
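As a rough illustration of that idea (a toy sketch, not the real MCP wire format, and not Sonatype's actual schema), an MCP-style tool pairs a machine-readable descriptor with its implementation, so an AI system can discover what the tool does and how to call it:

```python
# Toy illustration of MCP-style tool discovery and dispatch.
# The descriptor fields and tool name are invented for this sketch;
# the real MCP specification defines the exact protocol.
TOOL_DESCRIPTOR = {
    "name": "get_recommended_version",
    "description": "Return a recommended version for an open source package.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "ecosystem": {"type": "string", "description": "e.g. 'maven', 'npm'"},
            "package": {"type": "string"},
        },
        "required": ["ecosystem", "package"],
    },
}

def handle_call(name: str, arguments: dict) -> dict:
    """Dispatch a tool call the way an MCP server routes incoming requests."""
    if name != TOOL_DESCRIPTOR["name"]:
        raise ValueError(f"unknown tool: {name}")
    # Stubbed response; a real server would query live dependency data.
    return {"package": arguments["package"], "recommended": "1.2.3"}

# The AI first reads the descriptor to learn the schema, then calls the
# tool with structured arguments that match it.
print(handle_call("get_recommended_version",
                  {"ecosystem": "maven", "package": "commons-io"}))
```

The point of the descriptor is that the AI never has to guess an endpoint or argument format: the schema travels with the tool.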

MCP vs API

MCP, AI Assisted Coding, and Dependency Data: A Perfect Match

Shifting left now applies to AI too.

The Sonatype MCP server ensures that developers avoid risky dependencies by acting as a middleware layer between AI coding assistants and Sonatype's data. 

The MCP server provides guidance on which open source dependencies to pull and which versions are safest and newest, and the AI then uses those dependencies in its generated code. AI is great at creating code, but it's missing a lot of the contextual information about open source dependencies.

That's where the magic of the Sonatype Dependency Management MCP server comes in. It allows AI coding assistants to access the depth of knowledge of the world's leading software composition analysis (SCA) tool. The result is faster, higher quality outputs and safer, more resilient applications.
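To make the middleware idea concrete, here is a toy sketch of the flow. The function name, data, and response fields are invented for illustration; the real MCP server draws on Sonatype's SCA intelligence rather than a hard-coded table:

```python
# Toy middleware sketch: the assistant checks a dependency choice
# before writing it into generated code. All data here is fabricated
# for illustration only.
KNOWN_ISSUES = {
    ("npm", "left-pad", "0.0.9"): "deprecated",
}
LATEST_SAFE = {
    ("npm", "left-pad"): "1.3.0",
}

def check_dependency(ecosystem: str, package: str, version: str) -> dict:
    """Return a verdict and a safer suggestion, as the AI would receive it."""
    issue = KNOWN_ISSUES.get((ecosystem, package, version))
    suggestion = LATEST_SAFE.get((ecosystem, package), version)
    return {"ok": issue is None, "issue": issue, "use_instead": suggestion}

# Before the assistant writes `"left-pad": "0.0.9"` into package.json,
# it consults the checker and adopts the recommended version instead.
verdict = check_dependency("npm", "left-pad", "0.0.9")
print(verdict)
```

The assistant stays in charge of writing the code; the middleware only supplies the dependency intelligence it lacks.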

How It Works: A Developer Experience Example

To leverage Sonatype's Dependency Management MCP server, simply follow these steps for your AI coding assistant or AI-enabled IDE:

  1. Make the AI / IDE aware of the MCP server. This is typically done by providing a JSON-formatted configuration block, instructing the AI on how to connect to the MCP server.

  2. Add a system prompt instructing the AI to consult the Sonatype MCP server when making dependency decisions. An example prompt is: "Use the sonatype MCP server to figure out the right version to use whenever adding dependencies to the project."

  3. Vibe-code as before, but now with the benefit of Sonatype's open source intelligence. The AI will automatically consult Sonatype's data whenever adding dependencies, ensuring that your AI-generated code is built on top of modern, secure open source components.
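As an illustration of step 1, an MCP configuration block for an AI-enabled IDE typically looks like the following. The server name, command, and package here are placeholders, not Sonatype's actual values; the exact configuration comes from the guide linked below:

```json
{
  "mcpServers": {
    "sonatype": {
      "command": "npx",
      "args": ["-y", "example-sonatype-mcp-server"]
    }
  }
}
```

Once the assistant reloads this configuration, it can discover the server's tools and call them when making dependency decisions.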

Check out the configuration guide here for full details.

Check out the Sonatype Dependency Management MCP server today, and let your AI coding assistant write code with confidence, backed by Sonatype's world-class open source intelligence.