Building Music MCP: A Claude Desktop Connector for Apple Music

TL;DR: Inspired by Claude Desktop’s MCP connectors, I built a Model Context Protocol server that lets Claude control Apple Music on macOS. I used Claude Opus 4 for planning, Cursor for implementation, and the MCP server itself for debugging. The result? A fully functional npm package that brings natural music control to AI conversations.

The Spark of Inspiration

While exploring Claude Desktop’s capabilities, I discovered its MCP (Model Context Protocol) connectors—particularly the Notes app integration. The thought struck me: why not do the same for Apple Music? If Claude can manage my notes, why can’t it DJ my coding sessions?

This wasn’t just about building another tool; it was about creating a more natural way to interact with music while staying in the flow of AI-assisted development.

The Planning Phase: Opus 4 Takes the Lead

I knew I needed a solid foundation, so I turned to Claude Desktop with Opus 4 for the initial planning. Having studied @steipete’s macOS automator MCP and his excellent MCP best practices article, I had the perfect reference material.

The conversation with Opus 4 was iterative and thorough. I fed it the existing project structure and asked it to design a comprehensive MCP server for Apple Music control. After several rounds of refinement, we had a detailed architectural plan that included:

  • Core MCP tools for playback control, library search, and playlist management
  • AppleScript automation for seamless Music app integration
  • TypeScript implementation with proper error handling
  • Robust queue management for advanced music control
  • NPM packaging for easy distribution

Implementation: Cursor to the Rescue

With my Opus 4 tokens exhausted (the price of thoroughness!), I switched gears to Cursor. I pasted the entire plan and asked it to implement the music MCP server. The handoff was seamless—Cursor understood the architecture and began coding immediately.

What followed was a fascinating development process. Cursor methodically worked through each component, starting with the core MCP server structure and tool registration system. It crafted a comprehensive collection of AppleScript files for Apple Music automation, each handling specific aspects like playback control, library search, and playlist management.

The implementation process was surprisingly smooth. Cursor translated the architectural plan into clean TypeScript code, creating robust error handling and user-friendly messages throughout. It built intelligent search functionality that could find tracks across artists, albums, and songs, while also implementing advanced queue management that went beyond basic play/pause controls.
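Search is a good example of where AppleScript does the heavy lifting. As a sketch of how a search tool might assemble its script (the `searchScript` helper and the exact scope keywords are assumptions for illustration, not the package’s actual code):

```typescript
type SearchScope = "albums" | "artists" | "songs";

// Build an AppleScript search command against the user's library.
// Assumes the Music app's `search` verb; scope keywords are illustrative.
export function searchScript(query: string, scope?: SearchScope): string {
  // Escape embedded double quotes so the query stays inside the literal.
  const safe = query.replace(/"/g, '\\"');
  const filter = scope ? ` only ${scope}` : "";
  return `search playlist "Library" for "${safe}"${filter}`;
}
```

Escaping user input before splicing it into a script literal is exactly the kind of edge case that surfaced later during testing.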

One of the most impressive aspects was how Cursor maintained consistency across all the different MCP tools while ensuring each had its own focused responsibility. The playback controls felt natural, the library management was intuitive, and the playlist operations worked exactly as you’d expect from a modern music application.
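One workable way to get that consistency while keeping each tool focused (a sketch of the pattern, not the package’s actual structure) is a single table mapping each MCP tool name to a small AppleScript builder:

```typescript
type ScriptBuilder = (args: Record<string, string>) => string;

// Hypothetical tool table: one focused AppleScript builder per MCP tool.
const toolScripts: Record<string, ScriptBuilder> = {
  play: () => "play",
  pause: () => "pause",
  set_volume: ({ level }) => `set sound volume to ${Number(level)}`,
  current_track: () => 'get name of current track & " by " & artist of current track',
};

// Resolve a tool name to its script, failing loudly on unknown tools.
export function buildScript(tool: string, args: Record<string, string> = {}): string {
  const builder = toolScripts[tool];
  if (!builder) throw new Error(`Unknown tool: ${tool}`);
  return builder(args);
}
```

Every tool then shares the same lookup, argument handling, and error path, so adding a new capability is a one-line entry in the table.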

The Testing Adventure: MCP Debugging Itself

Here’s where things got really interesting. I installed the MCP server in Cursor and asked it to test the implementation. What followed was a fascinating display of AI-powered debugging:

  • Search issues: The MCP found problems with library search functionality
  • Queue management bugs: Issues with adding tracks to queues were identified
  • Real-time fixes: Cursor debugged and fixed the MCP server using the MCP server itself

Watching an AI assistant test, debug, and fix its own tool was genuinely mind-blowing. Good times to be alive, indeed.

Publishing and Distribution

The final step was making the tool accessible to everyone. I packaged it as an npm module and published it as @pedrocid/music-mcp. To use it with Claude Desktop, you can add this configuration to your MCP settings:

{
  "mcpServers": {
    "music-mcp": {
      "command": "npx",
      "args": ["@pedrocid/music-mcp@latest"]
    }
  }
}

With this configuration in place, Claude Desktop can drive Apple Music through plain conversation, keeping music control inside your AI-assisted workflow.

Real-World Usage

Once configured, the music MCP enables natural conversations like:

  • “Play my music and set the volume to 50%”
  • “What song is currently playing?”
  • “Create a playlist called ‘Road Trip’ and add some upbeat songs”
  • “Add ‘Bohemian Rhapsody’ to play next”
  • “Show me what’s in my up next queue”

Conclusion: The Joy of Vibe-Coded Projects

This project was pure vibe-coding—a side project driven by curiosity and the joy of building something genuinely useful. The entire development process, from Claude Desktop planning to Cursor implementation to MCP self-debugging, showcased the incredible potential of AI-assisted development.

The music MCP server isn’t just a tool; it’s a glimpse into a future where AI assistants seamlessly integrate with our digital environments, making technology feel more natural and intuitive.

Want to try it? Check out the music-mcp repository or install it directly with npx @pedrocid/music-mcp@latest.

The future of human-AI collaboration is here, and it sounds pretty good. 🎵