AI Repos for Marketers & Content Creators

Open-source AI tools for content creation, writing automation, and marketing workflows.

Repositories

No repositories found for this role yet.

MCP Servers

Memory

The Memory MCP Server is an official Model Context Protocol reference server maintained by Anthropic that gives AI assistants persistent memory across conversations using a local knowledge graph. Without persistent memory, every conversation with an AI assistant starts from zero -- the model has no recollection of prior interactions, user preferences, or established context. The Memory server solves this by maintaining a structured knowledge graph stored as a local JSONL file that the assistant can read from and write to during any session.

The knowledge graph is built on three core primitives. Entities are the primary nodes representing people, organizations, projects, concepts, or any other named object. Each entity has a unique name, a type classification, and a collection of observations. Relations are directed, labeled connections between entities stored in active voice (for example, "John_Smith works_at Anthropic"). Observations are discrete, atomic pieces of information attached to entities -- individual facts that can be independently added or removed without affecting other stored knowledge.

The server exposes nine MCP tools that provide full CRUD operations over the knowledge graph: create_entities and create_relations for building the graph, add_observations for appending new facts to existing entities, delete_entities, delete_relations, and delete_observations for pruning outdated information, read_graph for retrieving the complete graph structure, search_nodes for querying entities by name, type, or observation content, and open_nodes for retrieving specific entities with their full context and interconnections.

Storage uses a simple JSONL file format, making the knowledge base human-readable, version-controllable, and trivially portable. The storage path defaults to memory.jsonl in the working directory but can be customized via the MEMORY_FILE_PATH environment variable.
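To make the JSONL storage model concrete, here is a minimal Python sketch of reading and writing records shaped like the three primitives above. The exact field names (`entityType`, `relationType`, `from`, `to`) are assumptions about the server's on-disk format, not taken from this page, so treat them as illustrative.

```python
import json
from pathlib import Path

MEMORY_FILE = Path("memory.jsonl")  # default path; the server honors MEMORY_FILE_PATH

def append_record(record: dict, path: Path = MEMORY_FILE) -> None:
    """Append one entity or relation as a single JSONL line."""
    with path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def load_graph(path: Path = MEMORY_FILE):
    """Split the JSONL file back into entity and relation records."""
    entities, relations = [], []
    if path.exists():
        for line in path.read_text(encoding="utf-8").splitlines():
            if not line.strip():
                continue
            rec = json.loads(line)
            (entities if rec.get("type") == "entity" else relations).append(rec)
    return entities, relations

# Hypothetical records mirroring the example relation from the text:
append_record({"type": "entity", "name": "John_Smith",
               "entityType": "person",
               "observations": ["Prefers morning meetings"]})
append_record({"type": "relation", "from": "John_Smith",
               "to": "Anthropic", "relationType": "works_at"})

entities, relations = load_graph()
```

Because each record is one self-contained JSON line, appends never rewrite earlier data, which is what makes the file easy to diff and version-control.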
Installation is straightforward through npx or Docker, and the server integrates natively with Claude Desktop, Claude Code, VS Code, Cursor, and other MCP-compatible clients. With over 44,000 weekly npm downloads and backing from the 80,000-star modelcontextprotocol/servers repository, the Memory server has become one of the most widely adopted MCP servers in the ecosystem.
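For an npx-based setup, a typical Claude Desktop configuration looks like the sketch below. The package name follows the modelcontextprotocol/servers convention; the MEMORY_FILE_PATH value is a placeholder you would replace with your own path.

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "env": {
        "MEMORY_FILE_PATH": "/path/to/memory.jsonl"
      }
    }
  }
}
```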

81,024 stars

Atlassian MCP Server

An MCP server that connects AI assistants to Jira and Confluence -- the two most widely used project management and knowledge management tools in enterprise software. Ask your AI to search issues, create tickets, update sprint status, query Confluence pages, and manage your entire Atlassian workspace through natural language.

4,608 stars

Notion MCP Server

Official first-party MCP server from Notion Labs that gives AI assistants full read and write access to Notion workspaces — create pages, query databases, manage comments, and search across connected tools including Slack, Google Drive, and Jira.

4,038 stars

Linear MCP Server

Turn Claude into your Linear command center. This MCP server gives LLMs full CRUD on Linear issues, projects, and comments through a standardized Model Context Protocol interface. You're in a standup and someone asks "What's blocking the auth epic?" Instead of opening Linear and filtering, you ask Claude -- it queries the Linear API via this server and returns a clear list with links and assignees. Or you're drafting a sprint: "Create 5 issues under the Q2 roadmap, assign frontend to Sarah and backend to Mike." One prompt, and the server creates the issues with the right project, labels, and assignees.

The server supports both API key and OAuth authentication, so you can run it locally with a personal API key or hook it into your team's Linear workspace. Configuration fits the usual MCP clients: Cline, Claude Desktop, Cursor. Set LINEAR_API_KEY (or your OAuth flow), add the server to your MCP config, and you're done. No custom middleware or proxy -- just npx and your key.

Use it to triage issues ("List all high-priority bugs in Backend"), bulk-update labels, search issues by body or title, and keep project descriptions in sync with what you tell Claude. Pair it with a GitHub MCP server and you get issue-to-PR pipelines: "Create a Linear issue for this bug and open a branch." If you live in Linear and want your AI to live there too, this server is the bridge.
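A minimal MCP config for the API-key path might look like this. The package name `linear-mcp-server` and the key format are assumptions for illustration -- check the repository's README for the actual install command, and never commit a real API key.

```json
{
  "mcpServers": {
    "linear": {
      "command": "npx",
      "args": ["-y", "linear-mcp-server"],
      "env": {
        "LINEAR_API_KEY": "lin_api_..."
      }
    }
  }
}
```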

1 star

More for Marketers

Weekly AI Digest