Most RAG systems running in production today are agentic to some extent; how the agent is implemented depends on the specific use case.
1. Whenever we combine multiple data sources, some level of agency is inevitable, at least in the data-source selection and retrieval stages.
2. Here is how MCP (the Model Context Protocol) fits into the evolution of your Agentic RAG system in such scenarios:
• User Query Analysis: We pass the original user query to an LLM-based agent for analysis.
➡️ The original query may be rewritten, sometimes multiple times, producing one or more queries that flow through the rest of the pipeline.
➡️ The agent determines whether additional data sources are needed to answer the query.
• If additional data is required, a retrieval step is triggered. The agent can draw on several kinds of data: real-time user data, internal documents relevant to the user, and publicly available data on the web.
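The query-analysis step above can be sketched in a few lines. In production the analysis prompt would go to an LLM; here a simple keyword heuristic and an "and"-split rewrite stand in so the sketch is runnable. The function and data-class names are illustrative, not from any specific framework.

```python
from dataclasses import dataclass

@dataclass
class QueryAnalysis:
    rewrites: list[str]      # one or more rewritten queries
    needs_retrieval: bool    # whether extra data sources are required

def analyze_query(query: str) -> QueryAnalysis:
    """Analyze the user query: rewrite it and decide if retrieval is needed.

    A real system would ask an LLM-based agent; the heuristics below are
    stand-ins so this sketch runs on its own.
    """
    # Queries mentioning fresh or personal data likely need retrieval.
    needs = any(w in query.lower() for w in ("latest", "my", "internal", "today"))
    # A rewrite might split a compound question into independent sub-queries.
    rewrites = [part.strip() for part in query.split(" and ")]
    return QueryAnalysis(rewrites=rewrites, needs_retrieval=needs)
```

The key design point is that the analysis step returns structured output (rewrites plus a routing decision), which the rest of the loop can branch on.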
Intervention Points of MCP:
✅ Each data domain can manage its own MCP server and expose specific rules for data usage.
✅ Security and compliance can be ensured at the server level of each domain.
✅ New data domains can be added to the MCP server pool in a standardized manner without rewriting the agent, thereby decoupling how the system's procedural, episodic, and semantic memory evolve.
✅ Platform builders can expose their data to external consumers in a standardized way, making it easy to access over the network.
✅ AI engineers can continue to focus on the structure of the agent.
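The "one MCP server per data domain" idea can be illustrated with a minimal registry. This is a hedged sketch, not the real MCP SDK: `DomainServer` stands in for an MCP client session, and `retrieve` for a tool call the domain exposes. The point it shows is that adding a domain means registering one more server; the agent code that consumes the pool never changes.

```python
class DomainServer:
    """Stand-in for an MCP server owned and governed by one data domain."""

    def __init__(self, name: str, search_fn):
        self.name = name
        self._search = search_fn  # the retrieval tool this domain exposes

    def retrieve(self, query: str) -> list[str]:
        # Domain-level rules (security, compliance) would be enforced here,
        # inside the server, not in the agent.
        return self._search(query)

class ServerPool:
    """Registry of domain servers the agent can fan queries out to."""

    def __init__(self):
        self._servers: dict[str, DomainServer] = {}

    def register(self, server: DomainServer) -> None:
        # Adding a data domain is just one more registration.
        self._servers[server.name] = server

    def retrieve(self, query: str) -> dict[str, list[str]]:
        # Fan the query out to every registered domain.
        return {name: s.retrieve(query) for name, s in self._servers.items()}
```

Usage: `pool.register(DomainServer("internal_docs", search_docs))` brings a new domain online; the retrieval loop keeps calling `pool.retrieve(query)` unchanged.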
3. If no additional data is required, we ask the LLM to compose the answer directly (a single answer, multiple answers, or a set of actions).
4. Then, we analyze, summarize, and evaluate the answer for correctness and relevance:
➡️ If the agent determines that the answer is satisfactory, it will be returned to the user.
➡️ If the agent decides that the answer needs improvement, we attempt to rephrase the user’s query and repeat the generation loop.
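Steps 3 and 4 together form a generate-evaluate-retry loop, which can be sketched as follows. `generate`, `evaluate`, and `rewrite` are placeholders for LLM calls; the `max_rounds` cap is an assumption added so the loop cannot spin forever.

```python
def answer_with_retries(query: str, generate, evaluate, rewrite,
                        max_rounds: int = 3) -> str:
    """Generate an answer, judge it, and rephrase-and-retry if needed.

    generate(query) -> answer        : draft an answer (LLM call)
    evaluate(query, answer) -> bool  : correctness + relevance check (LLM judge)
    rewrite(query) -> query          : rephrase the query for another round
    """
    current = query
    answer = ""
    for _ in range(max_rounds):
        answer = generate(current)
        if evaluate(current, answer):
            return answer          # satisfactory: return it to the user
        current = rewrite(current)  # needs improvement: rephrase and loop
    return answer                   # best effort after max_rounds
```

The loop terminates either when the judge accepts the answer or when the round budget is exhausted, in which case the last draft is returned as a best effort.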