
Model Context Protocol

The Model Context Protocol (MCP) server provides programmatic access to J.P. Morgan Payments Developer Portal (PDP) documentation. Released as open source, the server lets developers search portal documentation, fetch specific pages in Markdown format, and discover related content, all through a single interface. It is built for integration with AI agents and developer tools, combining search, retrieval, and recommendation capabilities to streamline your workflow.

Key features include:

  • Search documentation: Find relevant PDP documentation using keywords or topics.

  • Read documentation: Retrieve and convert documentation pages to structured Markdown, preserving headings, code blocks, lists, and tables.

  • Related content: Discover additional or alternative documentation pages for deeper context.
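The three capabilities above can be sketched as plain functions over an in-memory document store. The page slugs, titles, and content below are hypothetical stand-ins, not actual PDP pages or the server's real tool signatures.

```python
# Hedged sketch of the three documentation tools over an in-memory store.
# Slugs, titles, and link structure are hypothetical examples.
DOCS = {
    "payments/getting-started": {
        "title": "Getting Started with Payments",
        "markdown": "# Getting Started\n\nHow to make your first API call.",
        "related": ["payments/authentication"],
    },
    "payments/authentication": {
        "title": "Authentication",
        "markdown": "# Authentication\n\nObtain and use access tokens.",
        "related": ["payments/getting-started"],
    },
}

def search_documentation(query: str) -> list[str]:
    """Return slugs of pages whose title or body matches the query."""
    q = query.lower()
    return [
        slug for slug, page in DOCS.items()
        if q in page["title"].lower() or q in page["markdown"].lower()
    ]

def read_documentation(slug: str) -> str:
    """Return the page content as structured Markdown."""
    return DOCS[slug]["markdown"]

def related_content(slug: str) -> list[str]:
    """Return slugs of related or alternative pages."""
    return DOCS[slug]["related"]
```

For example, `search_documentation("token")` matches only the authentication page, and `related_content` then surfaces its neighbors for deeper context.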

How it works

The following integration workflow provides a high-level overview of how to get started: 

1. Access the MCP folder in GitHub

Find the MCP resources and setup instructions in the Payments GitHub repository. The server is hosted publicly for community collaboration, with internal development managed separately. 

2. Set up your environment

Prepare your system with the required software (Python 3.10+, and uv or pip) and install the server's dependencies. The backend uses mature HTTP and Markdown-processing libraries for reliability and performance.

3. Integrate with your developer tool

Connect the MCP server to your preferred development environment or AI assistant (such as Copilot, Claude, or GPT) for seamless documentation access. The server is designed to serve as a backend for AI agents and IDE plugins. 
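Many MCP-capable tools (such as Claude Desktop and IDE plugins) are configured with a JSON entry that names the command used to launch the server. The sketch below builds such an entry; the server name `pdp-docs` and the launch command are hypothetical placeholders, so check your tool's documentation and the repository README for the exact values.

```python
import json

# Hedged sketch of an MCP client configuration entry. "pdp-docs" and the
# command/args are hypothetical placeholders for the real launch command.
config = {
    "mcpServers": {
        "pdp-docs": {
            "command": "uv",
            "args": ["run", "pdp-mcp-server"],
        }
    }
}

# Most tools expect this structure in their settings file as JSON.
print(json.dumps(config, indent=2))
```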

4. Launch the MCP server

Launch the MCP server to enable documentation search and retrieval capabilities. The server interfaces with the official PDP website and search API, and caches documentation resources for improved latency. 
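The caching behavior described above can be illustrated with a small time-to-live (TTL) cache wrapped around a fetch function. The fetcher and TTL value here are illustrative, not the server's actual implementation.

```python
import time

class TTLCache:
    """Cache fetched documentation pages for a fixed time-to-live (seconds)."""

    def __init__(self, fetch, ttl: float = 300.0):
        self.fetch = fetch          # callable that retrieves a page by slug/URL
        self.ttl = ttl
        self._store: dict[str, tuple[float, str]] = {}

    def get(self, key: str) -> str:
        now = time.monotonic()
        entry = self._store.get(key)
        if entry and now - entry[0] < self.ttl:
            return entry[1]         # fresh cache hit: skip the network round trip
        value = self.fetch(key)     # miss or stale entry: fetch and re-cache
        self._store[key] = (now, value)
        return value

# Usage with a stand-in fetcher that records each real fetch:
calls = []
cache = TTLCache(lambda k: (calls.append(k), f"page:{k}")[1], ttl=60)
cache.get("payments/getting-started")
cache.get("payments/getting-started")   # served from cache; no second fetch
```

Only the first `get` reaches the fetcher; repeat requests within the TTL are served from memory, which is the latency win the server's caching aims for.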

5. Explore and develop

Enhance your workflow by running the test suite and configuring environment variables. Planned enhancements include prompt templates and optimized resource caching.

For complete setup instructions, command references, and integration details, visit the GitHub repository.