OpenMemory MCP

Created by mem0ai

Memory for AI Agents; Announcing OpenMemory MCP - local and secure memory management.

Category: Data & Storage
Tags: ai, memory-management, chatbots, long-term-memory

Mem0 - The Memory Layer for Personalized AI


Learn more · Join Discord · Demo · OpenMemory


📄 Building Production-Ready AI Agents with Scalable Long-Term Memory →

⚡ +26% Accuracy vs. OpenAI Memory • 🚀 91% Faster • 💰 90% Fewer Tokens

🔥 Research Highlights

  • +26% Accuracy over OpenAI Memory on the LOCOMO benchmark
  • 91% Faster Responses than full-context, ensuring low latency at scale
  • 90% Lower Token Usage than full-context, cutting costs without compromise
  • Read the full paper

Introduction

Mem0 (“mem-zero”) enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. It remembers user preferences, adapts to individual needs, and continuously learns over time—ideal for customer support chatbots, AI assistants, and autonomous systems.

Key Features & Use Cases

Core Capabilities:

  • Multi-Level Memory: Seamlessly retains User, Session, and Agent state with adaptive personalization
  • Developer-Friendly: Intuitive API, cross-platform SDKs, and a fully managed service option
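
The multi-level design maps directly onto the SDK's scoping parameters. A minimal sketch, assuming the open-source Memory class and its user_id / run_id / agent_id keyword arguments (names per recent SDK versions; verify against the API Reference):

from mem0 import Memory

memory = Memory()  # default config; uses OpenAI and expects OPENAI_API_KEY to be set

# User-level memory: persists across every session for this user
memory.add("Prefers concise, bulleted answers", user_id="alice")

# Session-level memory: scoped to a single conversation via run_id
memory.add("Is currently debugging a login issue", user_id="alice", run_id="session-42")

# Agent-level memory: state tied to a particular agent rather than a user
memory.add("Escalate billing disputes to a human", agent_id="support-bot")

# Retrieval filters on the same identifiers
hits = memory.search("How should I respond?", user_id="alice", run_id="session-42")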

Applications:

  • AI Assistants: Consistent, context-rich conversations
  • Customer Support: Recall past tickets and user history for tailored help
  • Healthcare: Track patient preferences and history for personalized care
  • Productivity & Gaming: Adaptive workflows and environments based on user behavior

🚀 Quickstart Guide

Choose between our hosted platform and the self-hosted package:

Hosted Platform

Get up and running in minutes with automatic updates, analytics, and enterprise security.

  1. Sign up on Mem0 Platform
  2. Embed the memory layer via SDK or API keys
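
A minimal sketch of step 2, assuming the platform's MemoryClient with an API key exported as MEM0_API_KEY (an illustrative variable name; see the platform docs for the current client surface):

import os
from mem0 import MemoryClient

# The hosted client authenticates with an API key from the Mem0 Platform dashboard
client = MemoryClient(api_key=os.environ["MEM0_API_KEY"])

client.add(
    [
        {"role": "user", "content": "I'm vegetarian and allergic to nuts."},
        {"role": "assistant", "content": "Got it, I'll remember that."},
    ],
    user_id="alice",
)

# Later, pull back whatever is relevant to a new query
print(client.search("What can Alice eat?", user_id="alice"))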

Self-Hosted (Open Source)

Install the SDK via pip:

pip install mem0ai

Or install the SDK via npm:

npm install mem0ai

Basic Usage

Mem0 requires an LLM to function, with gpt-4o-mini from OpenAI as the default. However, it supports a variety of LLMs; for details, refer to our Supported LLMs documentation.
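
Swapping out the default LLM is done through a config dict passed to Memory.from_config. A minimal sketch (the provider name and config keys below are illustrative; see the Supported LLMs documentation for exact values):

from mem0 import Memory

config = {
    "llm": {
        "provider": "anthropic",  # illustrative; any supported provider works here
        "config": {
            "model": "claude-3-5-sonnet-20240620",
            "temperature": 0.1,
        },
    }
}

# Embeddings still use the default provider unless configured separately
memory = Memory.from_config(config)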

The first step is to instantiate the memory:

from openai import OpenAI
from mem0 import Memory

openai_client = OpenAI()
memory = Memory()

def chat_with_memories(message: str, user_id: str = "default_user") -> str:
    # Retrieve relevant memories
    relevant_memories = memory.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])

    # Generate Assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    memory.add(messages, user_id=user_id)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()
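
Beyond add and search, the Memory class also exposes calls for inspecting and maintaining what it has stored. A brief sketch (method names per recent open-source SDK versions; treat the API Reference as authoritative):

# List everything stored for a user
all_memories = memory.get_all(user_id="default_user")

# Assumes the v1.1 response shape: {"results": [{"id": ..., "memory": ...}, ...]}
first_id = all_memories["results"][0]["id"]

memory.update(memory_id=first_id, data="Prefers dark mode")  # rewrite one memory
print(memory.history(memory_id=first_id))                    # audit trail of changes
memory.delete(memory_id=first_id)                            # remove it entirely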

For detailed integration steps, see the Quickstart and API Reference.

🔗 Integrations & Demos

  • ChatGPT with Memory: Personalized chat powered by Mem0 (Live Demo)
  • Browser Extension: Store memories across ChatGPT, Perplexity, and Claude (Chrome Extension)
  • LangGraph Support: Build a customer bot with LangGraph + Mem0 (Guide)
  • CrewAI Integration: Tailor CrewAI outputs with Mem0 (Example)

📚 Documentation & Support

  • Full docs: https://docs.mem0.ai
  • Community: Discord · Twitter
  • Contact: [email protected]

Citation

We now have a paper you can cite:

@article{mem0,
  title={Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory},
  author={Chhikara, Prateek and Khant, Dev and Aryan, Saket and Singh, Taranjeet and Yadav, Deshraj},
  journal={arXiv preprint arXiv:2504.19413},
  year={2025}
}

⚖️ License

Apache 2.0 — see the LICENSE file for details.

Prerequisites

  • Familiarity with the server's domain
  • Basic understanding of the related technologies
  • Knowledge of Data & Storage

Recommended Servers

  • Mcp Server Toolhouse
  • Mcp Server Aidd
  • Obsidian Mcp Rest: An MCP server implementation for accessing Obsidian via a local REST API

Details

Created: June 15, 2025
Last updated: June 15, 2025
Category: Data & Storage
Author: mem0ai


More Servers

  • Mcp Server Datadog
  • Llm Context.py: Share code with LLMs via Model Context Protocol or clipboard. Rule-based customization enables easy switching between different tasks (like code review and documentation). Includes smart code outlining.
  • Mcp_pdf_forms
  • Mcp Wolfram Alpha: Connect your chat REPL to Wolfram Alpha computational intelligence