{"id":167,"date":"2025-08-07T18:07:17","date_gmt":"2025-08-07T18:07:17","guid":{"rendered":"https:\/\/aiinfrahub.com\/about-us\/?p=167"},"modified":"2025-08-07T18:07:17","modified_gmt":"2025-08-07T18:07:17","slug":"aws-strand-agent-integration-with-researcher-mcp-server","status":"publish","type":"post","link":"https:\/\/aiinfrahub.com\/about-us\/aws-strand-agent-integration-with-researcher-mcp-server\/","title":{"rendered":"AWS Strand Agent &#8211; integration with Researcher MCP server"},"content":{"rendered":"\n<p>This project demonstrates the power of&nbsp;<strong>AWS Strands Agents<\/strong>&nbsp;and how quickly you can build intelligent, context-aware applications. AWS provides a robust foundation for creating agents that can understand, reason, and act on complex data &#8211; and with&nbsp;<strong>Model Context Protocol (MCP)<\/strong>&nbsp;integration, you can extend these capabilities seamlessly.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Why AWS Strands Agents?<a href=\"https:\/\/github.com\/juggarnautss\/Strand_Agent_MCP_Server\/blob\/main\/README.md#why-aws-strand-agents\"><\/a><\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Model-Driven Orchestration<\/strong>: Strands leverages model reasoning to plan, orchestrate tasks, and reflect on goals.<\/li>\n\n\n\n<li><strong>Model and Provider Agnostic<\/strong>: Work with any LLM provider &#8211; Amazon Bedrock, OpenAI, Anthropic, local models. Switch providers without changing your code.<\/li>\n\n\n\n<li><strong>Simple Multi-Agent Primitives<\/strong>: Primitives for handoffs, swarms, and graph workflows, with built-in support for A2A.<\/li>\n\n\n\n<li><strong>Best-in-class AWS integration<\/strong>: Native tools for AWS service interactions. Deploy easily into EKS, Lambda, EC2, and more. 
Native MCP tool integration.<\/li>\n<\/ul>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What is an MCP Server?<\/h2>\n\n\n\n<p>An&nbsp;<strong>MCP (Model Context Protocol) Server<\/strong>&nbsp;is a lightweight, modular service that exposes tools or functions through a standard interface so they can be used by AI agents, workflows, or external systems. Built on the FastMCP framework, it allows developers to quickly register Python functions as callable tools over stdio, HTTP, or other transports, enabling LLM-driven automation, tool use, and orchestration.<\/p>\n\n\n\n<p><strong>NOTE:<\/strong> To learn more about developing an MCP server from scratch, see <a href=\"https:\/\/aiinfrahub.com\/about-us\/building-an-mcp-server-using-fastmcp-and-arxiv\/\">https:\/\/aiinfrahub.com\/about-us\/building-an-mcp-server-using-fastmcp-and-arxiv\/<\/a><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Project Goal<\/h2>\n\n\n\n<p>The goal of this project is to:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Let a <strong>Strand Agent<\/strong> query, list, and summarize research papers<\/li>\n\n\n\n<li>Let the Strand Agent leverage the model's reasoning to select and call the appropriate tool<\/li>\n\n\n\n<li>Let the Strand Agent fetch the tool's output (context) and feed it to the LLM to build the response<\/li>\n\n\n\n<li>Demonstrate how the LLM calls the Researcher tools:\n<ul class=\"wp-block-list\">\n<li>search_arxiv<\/li>\n\n\n\n<li>get_paper_info<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<p>This is useful for <strong>research automation<\/strong>, <strong>literature reviews<\/strong>, or building <strong>academic copilots<\/strong>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Architecture<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"870\" height=\"608\" 
src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-14.png\" alt=\"\" class=\"wp-image-176\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-14.png 870w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-14-300x210.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-14-768x537.png 768w\" sizes=\"auto, (max-width: 870px) 100vw, 870px\" \/><\/figure>\n\n\n\n<div style=\"height:34px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 class=\"wp-block-heading\">MCP Chatbot Workflow<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"866\" height=\"594\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-16.png\" alt=\"\" class=\"wp-image-179\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-16.png 866w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-16-300x206.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-16-768x527.png 768w\" sizes=\"auto, (max-width: 866px) 100vw, 866px\" \/><\/figure>\n\n\n\n<div style=\"height:34px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Imports and Configuration<\/h3>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"926\" height=\"406\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-19.png\" alt=\"\" class=\"wp-image-184\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-19.png 926w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-19-300x132.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-19-768x337.png 768w\" sizes=\"auto, (max-width: 926px) 100vw, 926px\" \/><\/figure>\n\n\n\n<div style=\"height:34px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">MCP Client Setup<\/h3>\n\n\n\n<p>This 
wraps your FastMCP server (<code>research_server.py<\/code>) as a subprocess via <code>uv<\/code>, using the <strong><code>stdio<\/code> transport<\/strong>. It returns an <code>MCPClient<\/code> instance that can:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Discover tools,<\/li>\n\n\n\n<li>Execute them when the agent calls them.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"888\" height=\"265\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-20.png\" alt=\"\" class=\"wp-image-185\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-20.png 888w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-20-300x90.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-20-768x229.png 768w\" sizes=\"auto, (max-width: 888px) 100vw, 888px\" \/><\/figure>\n\n\n\n<div style=\"height:34px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Researcher server tool discovery<\/h3>\n\n\n\n<p>This <strong>connects to the server<\/strong> and discovers the available <code>@mcp.tool()<\/code> functions exposed by <code>research_server.py<\/code>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"915\" height=\"417\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-21.png\" alt=\"\" class=\"wp-image-186\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-21.png 915w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-21-300x137.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-21-768x350.png 768w\" sizes=\"auto, (max-width: 915px) 100vw, 915px\" \/><\/figure>\n\n\n\n<div style=\"height:34px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">LLM Initialization<\/h3>\n\n\n\n<p>This sets up the Amazon Bedrock LLM backend. 
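<\/p>

<p>Parameters such as temperature and token limits can be collected into a settings mapping before constructing the model. A minimal sketch follows; the parameter names and the example model ID are illustrative assumptions, not values taken from the repository.<\/p>

```python
# Illustrative Bedrock model settings (a sketch; check the Strands SDK docs
# for the exact keyword names its model wrapper accepts).
bedrock_settings = {
    "model_id": "anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
    "temperature": 0.3,   # lower values make tool selection more deterministic
    "max_tokens": 2048,   # upper bound on tokens generated per reply
}

# These settings would then be unpacked into the model constructor,
# e.g. model = BedrockModel(**bedrock_settings)  # hypothetical call
```

<p>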
We can control parameters such as <code>temperature<\/code> and <code>max_tokens<\/code>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"865\" height=\"226\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-22.png\" alt=\"\" class=\"wp-image-187\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-22.png 865w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-22-300x78.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-22-768x201.png 768w\" sizes=\"auto, (max-width: 865px) 100vw, 865px\" \/><\/figure>\n\n\n\n<div style=\"height:34px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Agent creation with tools<\/h3>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"897\" height=\"234\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-23.png\" alt=\"\" class=\"wp-image-188\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-23.png 897w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-23-300x78.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-23-768x200.png 768w\" sizes=\"auto, (max-width: 897px) 100vw, 897px\" \/><\/figure>\n\n\n\n<p>The agent is now:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>LLM-powered<\/li>\n\n\n\n<li>Tool-augmented (via MCP)<\/li>\n\n\n\n<li>Context-aware (with a focused system prompt).<\/li>\n<\/ul>\n\n\n\n<p><strong>Note<\/strong>: The prompt enforces <strong>tool usage and citation discipline<\/strong>, so the agent responds based only on real documents, not assumptions.<\/p>\n\n\n\n<div style=\"height:34px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Interactive chat loop<\/h3>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"884\" height=\"346\" 
src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-24.png\" alt=\"\" class=\"wp-image-189\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-24.png 884w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-24-300x117.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-24-768x301.png 768w\" sizes=\"auto, (max-width: 884px) 100vw, 884px\" \/><\/figure>\n\n\n\n<p>A CLI-style interface for easy testing. It sends user queries to the <code>Agent<\/code>, which:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Parses intent,<\/li>\n\n\n\n<li>Selects tools (from the MCP server),<\/li>\n\n\n\n<li>Uses the LLM to formulate a response.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Running the Agent &#8211; mcp_chatbot<\/h2>\n\n\n\n<pre class=\"wp-block-preformatted\">git clone https:\/\/github.com\/juggarnautss\/Strand_Agent_MCP_Server.git<br>cd Strand_Agent_MCP_Server<br>uv init<br>uv pip install -r requirements.txt<br>python mcp_chatbot.py<\/pre>\n\n\n\n<p><\/p>\n\n\n\n<p><strong>CLI screen of chatbot:<\/strong><\/p>\n\n\n\n<p>We can see the MCP server's tools being listed as part of chatbot initialization.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"382\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-11-1024x382.png\" alt=\"\" class=\"wp-image-169\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-11-1024x382.png 1024w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-11-300x112.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-11-768x286.png 768w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-11.png 1228w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:46px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><strong>Researcher 
server:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"69\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-25-1024x69.png\" alt=\"\" class=\"wp-image-192\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-25-1024x69.png 1024w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-25-300x20.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-25-768x52.png 768w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-25.png 1371w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:45px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Testing the Agent &#8211; mcp_chatbot<\/h2>\n\n\n\n<pre class=\"wp-block-code\"><code><strong>User:<\/strong> List two research papers on Artificial Intelligence<\/code><\/pre>\n\n\n\n<p>We see that the Bedrock backend LLM has reasoned and selected the MCP tool &#8220;<strong>search_arxiv<\/strong>()&#8221; to fetch paper IDs; that is, Strands leverages model reasoning to plan, orchestrate tasks, and reflect on goals.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"219\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-12-1024x219.png\" alt=\"\" class=\"wp-image-170\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-12-1024x219.png 1024w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-12-300x64.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-12-768x164.png 768w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-12.png 1218w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" 
class=\"wp-block-spacer\"><\/div>\n\n\n\n<pre class=\"wp-block-code\"><code><strong>User:<\/strong> Paper ID: 2304.02924v1<\/code><\/pre>\n\n\n\n<p>The LLM now auto-selects the tool &#8220;<strong>get_paper_info()<\/strong>&#8221; to fetch the paper details.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"204\" src=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-13-1024x204.png\" alt=\"\" class=\"wp-image-171\" srcset=\"https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-13-1024x204.png 1024w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-13-300x60.png 300w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-13-768x153.png 768w, https:\/\/aiinfrahub.com\/wp-content\/uploads\/2025\/08\/image-13.png 1143w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:49px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusion<\/h2>\n\n\n\n<p>In this blog, we explored how to build an intelligent, tool-augmented research assistant using <strong>Amazon Strands Agents<\/strong> and the <strong>Model Context Protocol (MCP)<\/strong>. <\/p>\n\n\n\n<p>By connecting a Bedrock model as the agent's backend and powering it with a custom MCP server, we enabled the agent to reason about and interact with real research tools in a structured, reliable way. 
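<\/p>

<p>Under the hood, each such tool invocation travels to the MCP server as a JSON-RPC message over the chosen transport. Below is a minimal sketch of the <code>tools\/call<\/code> request the client would send for <code>get_paper_info<\/code>; the envelope follows the JSON-RPC 2.0 shape MCP uses, but the argument name <code>paper_id<\/code> is an assumption for illustration, since the real tool signature lives in <code>research_server.py<\/code>.<\/p>

```python
import json

def mcp_tool_call(name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 'tools/call' request like the one an MCP
    client writes to the server's stdin over the stdio transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# "paper_id" is a hypothetical argument name; check research_server.py
# for the tool's actual parameters.
request = mcp_tool_call("get_paper_info", {"paper_id": "2304.02924v1"})
print(request)
```

<p>The MCP client builds and parses these envelopes for us, so agent code never deals with the wire format directly.<\/p>

<p>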
<\/p>\n\n\n\n<p>This setup demonstrates the core strength of Strands Agents, <strong>model-driven orchestration<\/strong>, allowing developers to build modular, extensible AI systems that can scale across use cases like research, DevOps, and more.<\/p>\n\n\n\n<div style=\"height:45px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">References<\/h2>\n\n\n\n<p><a href=\"https:\/\/strandsagents.com\/latest\/\" target=\"_blank\" rel=\"noreferrer noopener\">Strands Agents SDK<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/aiinfrahub.com\/about-us\/building-an-mcp-server-using-fastmcp-and-arxiv\/\" target=\"_blank\" rel=\"noreferrer noopener\">Building an MCP Server using FastMCP and arXiv<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/github.com\/juggarnautss\/Strand_Agent_MCP_Server\/tree\/main\" target=\"_blank\" rel=\"noreferrer noopener\">GitHub: Strand Agent MCP Server<\/a><\/p>\n\n\n\n<div style=\"height:59px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>This project demonstrates the power of&nbsp;AWS Strands Agents&nbsp;and how quickly you can build intelligent, context-aware applications. AWS provides a robust foundation for creating agents that can understand, reason, and act on complex data &#8211; and with&nbsp;Model Context Protocol (MCP)&nbsp;integration, you can extend these capabilities seamlessly. Why AWS Strands Agents? What is an MCP Server? 
An&nbsp;MCP &#8230; <a title=\"AWS Strand Agent &#8211; integration with Researcher MCP server\" class=\"read-more\" href=\"https:\/\/aiinfrahub.com\/about-us\/aws-strand-agent-integration-with-researcher-mcp-server\/\" aria-label=\"Read more about AWS Strand Agent &#8211; integration with Researcher MCP server\">Read more<\/a><\/p>\n","protected":false},"author":1,"featured_media":174,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[14],"class_list":["post-167","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-agenticai","tag-aws-strandssdk-strandagents-awsbedrock-agenticai-aiagents-autonomousagents-modelcontextprotocol-mcp-fastmcp"],"_links":{"self":[{"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/posts\/167","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/comments?post=167"}],"version-history":[{"count":15,"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/posts\/167\/revisions"}],"predecessor-version":[{"id":200,"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/posts\/167\/revisions\/200"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/media\/174"}],"wp:attachment":[{"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/media?parent=167"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/categories?post=167"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/aiinfrahub.com\/about-us\/wp-json\/wp\/v2\/tags?post=167"}],
"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}