
    How to Create Serverless AI Agents with Langbase Docs MCP Server in Minutes

    May 6, 2025

    Building serverless AI agents has recently become a lot simpler. With the Langbase Docs MCP server, you can instantly connect AI models to the Langbase documentation, making it easy to build composable, agentic AI systems with memory, without managing complex infrastructure.

    In this guide, you’ll learn how to set up the Langbase Docs MCP server inside Cursor (an AI code editor), and build a summary AI agent that uses Langbase docs as live, on-demand context.

    Here’s what we’ll cover:

    • Prerequisites

    • What is Model Context Protocol (MCP)?

    • Anthropic’s role in launching MCP

    • Cursor AI code editor

    • What is Langbase and why is its Docs MCP server useful?

    • How to set up the Langbase Docs MCP server in Cursor

    • How to use the Langbase Docs MCP server in Cursor AI

    • Use case: Build a summary AI Agent with Langbase Docs MCP server

    Prerequisites

    Before we begin creating the agent, you’ll need to have a few things set up and some tools ready to go.

    In this tutorial, I’ll be using the following tech stack:

    • Langbase – the platform to build and deploy your serverless AI agents.

    • Langbase SDK – a TypeScript AI SDK, designed to work with JavaScript, TypeScript, Node.js, Next.js, React, and the like.

    • Cursor – an AI code editor similar to VS Code.

    You’ll also need to:

    • Sign up on Langbase to get your API key.

    What is Model Context Protocol (MCP)?

    Model Context Protocol (MCP) is an open protocol that standardizes how applications provide external context to large language models (LLMs). With MCP, developers can connect AI models to various tools and data sources like documentation, APIs, and databases – in a clean, consistent way.

    Instead of relying solely on prompts, MCP allows LLMs to call custom tools (like documentation fetchers or API explorers) during a conversation.
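    To make this concrete, here is a minimal TypeScript sketch of the JSON-RPC 2.0 request an MCP client sends when the model decides to invoke a tool. The `tools/call` method and message shape follow the MCP specification; the tool name `docs_route_finder` is one of the Langbase tools covered later, while the `query` argument name is illustrative, not taken from the Langbase server:

```typescript
// Shape of an MCP "tools/call" request (MCP messages use JSON-RPC 2.0).
interface McpToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;                       // which tool to invoke
    arguments: Record<string, unknown>; // tool-specific input
  };
}

// Example: the client asks a docs server to look up a route.
// The "query" argument is a hypothetical input for illustration.
const request: McpToolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "docs_route_finder",
    arguments: { query: "Langbase Memory API" },
  },
};

console.log(JSON.stringify(request));
```

The host (Cursor, Claude Desktop, and so on) handles this plumbing for you; the sketch just shows what travels between client and server.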

    MCP General Architecture

    At its core, MCP follows a client-server architecture where a host application can connect to multiple servers.


    The Model Context Protocol architecture lets AI clients (like Claude, IDEs, and developer tools) securely connect to multiple local or remote data sources in real time. MCP clients communicate with one or more MCP servers, which act as bridges to structured data – whether from local files, databases, or remote APIs.

    This setup allows AI models to retrieve fresh, relevant context from different sources seamlessly, without embedding data directly into the model.

    Anthropic’s Role in Launching MCP

    Anthropic introduced MCP as part of their vision to make LLMs tool-augmented by default. MCP was originally built to expand Claude’s capabilities, but it’s now available more broadly and supported in developer-friendly environments like Cursor and Claude Desktop.

    By standardizing how tools integrate into LLM workflows, MCP makes it easier for developers to extend AI systems without custom plugins or API hacks.


    Cursor AI Code Editor

    Cursor is a developer-first AI code editor that integrates LLMs (like Claude, GPT, and more) directly into your IDE. Cursor supports MCP, meaning you can quickly attach custom tool servers – like the Langbase Docs MCP server – and make them accessible as AI-augmented tools while you code.

    Think of Cursor as VS Code meets AI agents – with built-in support for smart tools like docs fetchers and code-example retrievers.

    What is Langbase and Why is its Docs MCP Server Useful?

    Langbase is a powerful serverless AI platform for building AI agents with memory. It helps developers build AI-powered apps and assistants by connecting LLMs directly to their data, APIs, and documentation.

    The Langbase Docs MCP server exposes the Langbase documentation and API reference so they can be used as live context for your LLMs.

    By connecting this server to Cursor (or any MCP-supported IDE), you can make Langbase documentation available to your AI agents on demand. This means less context-switching, faster workflows, and smarter assistance when building serverless agentic applications.

    How to Set Up the Langbase Docs MCP Server in Cursor

    Let’s walk through setting up the server step-by-step.

    1. Open Cursor Settings

    Launch Cursor and open Settings. From the left sidebar, select MCP.

    2. Add a New MCP Server

    Click the yellow + Add new global MCP server button.


    3. Configure the Langbase Docs MCP Server

    Paste the following configuration into the mcp.json file:

    {
        "mcpServers": {
            "Langbase": {
                "command": "npx",
                "args": ["@langbase/cli", "docs-mcp-server"]
            }
        }
    }

    4. Start the Langbase Docs MCP Server

    In your terminal, run:

    pnpm add @langbase/cli
    

    And then run this command:

    pnpm dlx @langbase/cli docs-mcp-server
    

    5. Enable the MCP Server in Cursor

    In the MCP settings, make sure the Langbase server is toggled to Enabled.


    How to Use Langbase Docs MCP Server in Cursor AI

    Once everything is set up, Cursor’s AI agent can call Langbase docs tools like:

    • docs_route_finder

    • sdk_documentation_fetcher

    • examples_tool

    • guide_tool

    • api_reference_tool

    For example, you can ask the Cursor agent:

    “Show me the API reference for Langbase Memory”
     or
     “Find a code example of creating an AI agent pipe in Langbase”
    

    The AI will use the Docs MCP server to fetch precise documentation snippets – directly inside Cursor.

    Use Case: Build a Summary AI Agent with Langbase Docs MCP Server

    Let’s build a summary agent that summarizes context using the Langbase SDK, powered by the Langbase Docs MCP server inside the Cursor AI code editor.

    1. Open an empty folder in Cursor and launch the chat panel (Cmd+Shift+I on Mac or Ctrl+Shift+I on Windows).

    2. Switch to Agent mode from the mode selector and pick your preferred LLM (we’ll use Claude 3.5 Sonnet for this demo).

    3. In the chat input, enter the following prompt:
      “In this directory, using the Langbase SDK, create a summary pipe agent. Use TypeScript and pnpm to run the agent in the terminal.”

    4. Cursor will automatically invoke MCP calls, generate the required files and code using Langbase Docs as context, and suggest changes. Accept the changes, and your summary agent will be ready. You can run the agent using the commands provided by Cursor and view the results.

    Here’s a demo video of creating this summary agent with a single prompt and Langbase Docs MCP server:

    By combining Langbase’s Docs MCP server with Cursor AI, you’ve learned how to build serverless AI agents in minutes – all without leaving your IDE.

    If you’re building AI agents, tools, or apps with Langbase, this is one of the fastest ways to simplify your development process.

    Happy building! 🚀

    Connect with me 🙌:

    • Subscribe to my YouTube channel if you want to learn about AI and agents.

    • Subscribe to my free newsletter “The Agentic Engineer”, where I share the latest AI and agent news, trends, jobs, and much more.

    • Follow me on X (Twitter).

    Source: freeCodeCamp Programming Tutorials: Python, JavaScript, Git & More 
