Claude-code-minoan academic-research

Search academic papers, build literature reviews, and synthesize research findings — combines Exa MCP (research_paper category, arxiv filtering) with arxiv-mcp-server for paper discovery, download, and deep analysis. Triggers on academic paper, literature review, research synthesis, arxiv, find papers, scholarly search.

install
source · Clone the upstream repo
git clone https://github.com/tdimino/claude-code-minoan
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/tdimino/claude-code-minoan "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/research/academic-research" ~/.claude/skills/tdimino-claude-code-minoan-academic-research && rm -rf "$T"
manifest: skills/research/academic-research/SKILL.md
source content

Academic Research

This skill provides comprehensive guidance for academic paper search, literature reviews, and research synthesis using Exa MCP and arxiv-mcp-server.

When to Use This Skill

  • Searching for academic papers on a topic
  • Conducting literature reviews
  • Finding papers by specific authors
  • Discovering recent research in a field
  • Downloading and analyzing arXiv papers
  • Synthesizing findings across multiple papers
  • Tracking citation networks and influential papers
  • Researching state-of-the-art methods in AI/ML

Available Tools

Exa MCP Server (Web Search with Academic Filtering)

Tools:

mcp__exa__web_search_exa, mcp__exa__get_code_context_exa, mcp__exa__deep_search_exa

Key Parameters for Academic Search:

  • category: "research_paper"
    - Filter results to academic papers
  • includeDomains: ["arxiv.org"]
    - Restrict to arXiv
  • startPublishedDate / endPublishedDate
    - Filter by publication date
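
These parameters can be combined in a single call. A sketch (each parameter name appears above or elsewhere in this skill; combining them all in one call is an assumption, not a documented guarantee):

```
mcp__exa__web_search_exa({
  query: "sparse attention long context",
  category: "research_paper",
  includeDomains: ["arxiv.org"],
  startPublishedDate: "2024-01-01",
  endPublishedDate: "2025-01-01",
  numResults: 10
})
```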

ArXiv MCP Server (Paper Search, Download, Analysis)

Tools:

search_papers, download_paper, list_papers, read_paper

Capabilities:

  • Search arXiv by keyword, author, or category
  • Download papers locally (~/.arxiv-papers)
  • Read paper content directly
  • Deep paper analysis with built-in prompts

Core Workflows

Workflow 1: Quick Paper Discovery

Use case: Find papers on a specific topic quickly

Step 1: Use Exa with research_paper category
mcp__exa__web_search_exa({
  query: "transformer attention mechanisms survey",
  category: "research_paper",
  numResults: 10
})

Step 2: Review titles and abstracts
Step 3: Note arXiv IDs for deeper analysis
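
Step 3 can be scripted: modern arXiv IDs follow a known `YYMM.NNNNN` pattern, so a small helper (hypothetical, not part of either MCP server) can pull them out of result URLs:

```python
import re

# Modern arXiv IDs look like 2301.00234, optionally with a vN version
# suffix; they appear in URLs such as https://arxiv.org/abs/2301.00234v2.
ARXIV_ID = re.compile(r"arxiv\.org/(?:abs|pdf)/(\d{4}\.\d{4,5})(?:v\d+)?")

def extract_arxiv_ids(urls):
    """Return unique arXiv IDs found in a list of result URLs, in order."""
    seen, ids = set(), []
    for url in urls:
        m = ARXIV_ID.search(url)
        if m and m.group(1) not in seen:
            seen.add(m.group(1))
            ids.append(m.group(1))
    return ids

print(extract_arxiv_ids([
    "https://arxiv.org/abs/2301.00234",
    "https://arxiv.org/pdf/2301.00234v2",
    "https://openreview.net/forum?id=abc",
]))  # ['2301.00234']
```

Non-arXiv results (e.g., OpenReview links) are simply skipped; they can still be read via Exa but not downloaded through the arXiv MCP server.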

Workflow 2: ArXiv-Focused Search

Use case: Search specifically within arXiv

Step 1: Use arxiv MCP search_papers
search_papers({
  query: "large language models reasoning",
  max_results: 20,
  sort_by: "relevance"
})

Step 2: Download papers
download_paper({ arxiv_id: "2301.00234" })

Step 3: Read and analyze
read_paper({ arxiv_id: "2301.00234" })

Workflow 3: Comprehensive Literature Review

Step 1: Broad discovery with Exa (category: "research_paper")
Step 2: Identify key papers and authors
Step 3: Deep dive with arXiv MCP (download + read_paper)
Step 4: Synthesize findings by methodology/approach
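
Step 4 amounts to building a synthesis matrix. A minimal sketch, assuming each paper read in Step 3 has been annotated with hypothetical `title` and `method` fields:

```python
from collections import defaultdict

def group_by_method(papers):
    """Group paper records into a synthesis matrix keyed by methodology.

    Each record is a dict with 'title' and 'method' fields filled in
    while reading papers (the field names here are illustrative).
    """
    matrix = defaultdict(list)
    for p in papers:
        matrix[p["method"]].append(p["title"])
    return dict(matrix)

papers = [
    {"title": "MoBA", "method": "block-sparse"},
    {"title": "NSA", "method": "block-sparse"},
    {"title": "Ring Attention", "method": "sequence-parallel"},
]
print(group_by_method(papers))
# {'block-sparse': ['MoBA', 'NSA'], 'sequence-parallel': ['Ring Attention']}
```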

Workflow 4: Recent Developments Tracking

Step 1: Time-filtered Exa search
mcp__exa__web_search_exa({
  query: "multimodal large language models",
  category: "research_paper",
  startPublishedDate: "2024-01-01"
})

Step 2: Sort arXiv by submitted_date
search_papers({ query: "multimodal LLM", sort_by: "submitted_date" })

ArXiv Categories Reference

| Category | Description |
| --- | --- |
| cs.AI | Artificial Intelligence |
| cs.CL | Computation and Language (NLP) |
| cs.CV | Computer Vision |
| cs.LG | Machine Learning |
| cs.NE | Neural and Evolutionary Computing |
| stat.ML | Statistics - Machine Learning |
| cs.RO | Robotics |
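
These category codes also plug directly into the public arXiv export API's `cat:` field syntax, which is useful when working outside the MCP servers. A minimal sketch (the endpoint and query syntax are arXiv's documented export API; the helper itself is hypothetical):

```python
from urllib.parse import urlencode

def arxiv_query_url(terms, category, max_results=10):
    """Build an arXiv export-API URL scoped to one category,
    newest submissions first."""
    search = f"cat:{category} AND all:{terms}"
    params = urlencode({
        "search_query": search,
        "max_results": max_results,
        "sortBy": "submittedDate",
        "sortOrder": "descending",
    })
    return f"http://export.arxiv.org/api/query?{params}"

print(arxiv_query_url("sparse attention", "cs.LG"))
```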

Academic Domain Filtering

For Exa searches, restrict to academic sources:

includeDomains: [
  "arxiv.org",
  "aclanthology.org",
  "openreview.net",
  "proceedings.mlr.press",
  "papers.nips.cc",
  "openaccess.thecvf.com"
]

Tool Selection Guide

| Task | Primary Tool | Alternative |
| --- | --- | --- |
| Broad topic search | Exa (research_paper) | arXiv search_papers |
| ArXiv-specific | arXiv search_papers | Exa with includeDomains |
| Download paper | arXiv download_paper | - |
| Full paper content | arXiv read_paper | - |
| Code implementations | Exa get_code_context | - |
| Very recent papers | arXiv (submitted_date) | Exa with date filter |

Best Practices

  1. Start broad with Exa's research_paper category, then narrow
  2. Use date filtering for recent developments
  3. Download key papers via arXiv MCP for persistent access
  4. Cross-reference multiple search approaches
  5. Use technical terms in queries for better results
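
Practice 4 (cross-referencing) can be made concrete: merge the ID lists from both search approaches and track which source surfaced each paper. A sketch with a hypothetical helper:

```python
def cross_reference(exa_ids, arxiv_ids):
    """Merge arXiv IDs from two search approaches, tracking which
    source(s) surfaced each paper. Papers found by both searches are
    often the most central to the topic."""
    sources = {}
    for pid in exa_ids:
        sources.setdefault(pid, set()).add("exa")
    for pid in arxiv_ids:
        sources.setdefault(pid, set()).add("arxiv")
    return {pid: sorted(srcs) for pid, srcs in sources.items()}

hits = cross_reference(["2301.00234", "2405.11111"], ["2301.00234"])
print(hits["2301.00234"])  # ['arxiv', 'exa']
```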

Domain: Subquadratic Attention

Research domain for post-transformer attention mechanisms that break the O(n^2) barrier. Active area with rapid publication cadence (2024–2026).

Key Papers

| Paper | Year | Key Contribution |
| --- | --- | --- |
| FlashAttention-2 (Dao) | 2023 | IO-aware exact attention — foundation for all subsequent work |
| DuoAttention | 2024 | Split attention heads into retrieval (sparse) vs streaming (full) |
| Ring Attention | 2024 | Distributed sequence parallelism across devices |
| MoBA (Mixture of Block Attention) | 2025 | Block-sparse top-k gating with Triton kernel, 1M tokens |
| NSA (Native Sparse Attention, DeepSeek) | 2025 | Hardware-aligned sparse attention patterns |
| TokenSelect | 2025 | Dynamic per-layer token pruning |

Pre-Built Search Queries

# Exa (research_paper category)
"subquadratic attention mechanism" --category "research paper" --after 2024-01-01
"block sparse attention triton kernel" --category "research paper"
"mixture of attention heads sparse" --category "research paper"
"linear attention transformer approximation" --category "research paper" --after 2024-06-01

# ArXiv (cs.LG + cs.CL)
search_papers({ query: "subquadratic attention sparse transformer", max_results: 20, sort_by: "submitted_date" })
search_papers({ query: "block sparse FlashAttention kernel", max_results: 10 })

Evaluation Criteria

When comparing subquadratic attention mechanisms, benchmark on:

| Criterion | What to Measure |
| --- | --- |
| Quality | Perplexity degradation vs full attention at target sequence length |
| Speed | Wall-clock speedup on consumer GPUs (RTX 4090, M4 Max) |
| Memory | Reduction factor at 128K / 512K / 1M context |
| Compatibility | Drop-in replacement vs requires retraining |
| Sparsity | How much computation is actually skipped (e.g., 95% at 1M tokens) |
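
The sparsity criterion is easy to estimate on paper for block-sparse schemes: with top-k block gating, each query block attends to k of the key blocks instead of all of them. A back-of-envelope sketch (block size and top-k values here are illustrative, not any paper's actual settings):

```python
def block_sparse_fraction(seq_len, block_size, top_k):
    """Fraction of attention score blocks actually computed when each
    query block attends only to its top-k key blocks (MoBA-style gating).
    Full attention computes n_blocks**2 blocks; top-k keeps n_blocks * k.
    """
    n_blocks = seq_len // block_size
    k = min(top_k, n_blocks)
    return k / n_blocks

# At 1M tokens with 512-token blocks and top-k = 96:
frac = block_sparse_fraction(1_000_000, 512, 96)
print(f"computed: {frac:.1%}, skipped: {1 - frac:.1%}")
# computed: 4.9%, skipped: 95.1%
```

With these illustrative settings roughly 95% of the block computation is skipped at 1M tokens, consistent with the ballpark quoted in the table above.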

Local Implementation Reference

Working MoBA implementation with Triton kernels:

~/Desktop/Aldea/01-Repos/perplexity-clone/model/moba_block_sparse.py


Reference Documentation

For detailed parameters and advanced usage:

  • references/exa-academic-search.md
    - Exa parameters for academic search
  • references/arxiv-mcp-tools.md
    - ArXiv MCP server tool reference