Use Greptile MCP tools to generate reports on code review activity, identify trends, and track team performance.
These workflows are performed through your AI coding assistant (Claude, Cursor, etc.) after setting up MCP.

Repository Health Overview

Ask your AI assistant to generate a health summary:

```
Give me a health overview of my repositories using Greptile data
```
Your assistant will use multiple MCP tools to compile:
  • Total PRs reviewed vs unreviewed
  • Number of open security issues
  • Most active repositories by review volume
  • Custom context adoption rates
  • Review completion percentages
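
Under the hood, this rollup is aggregation over the review records the MCP tools return. A minimal sketch in Python, assuming a hypothetical record shape with `repository` and `status` fields (the real Greptile MCP payload may differ):

```python
from collections import Counter


def health_overview(reviews):
    """Summarize review records into a small health report.

    Each record is assumed (hypothetically) to look like:
    {"repository": "myorg/backend", "status": "completed"}
    """
    total = len(reviews)
    completed = sum(1 for r in reviews if r["status"] == "completed")
    by_repo = Counter(r["repository"] for r in reviews)
    return {
        "total_reviewed": total,
        "completion_pct": round(100 * completed / total, 1) if total else 0.0,
        "most_active": by_repo.most_common(3),  # top repos by review volume
    }
```

The same counting approach extends to the other bullets (security issues open, custom context adoption) once the corresponding fields are known.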

PR Review Summary

Get a summary of all reviews for a specific time period:

```
Summarize all Greptile code reviews from the past week
```
Example output:

```
## Weekly Code Review Summary (Nov 18-24)

### Overview
- **Total PRs reviewed:** 47
- **Completed reviews:** 43 (91.5%)
- **Failed/Skipped:** 4

### By Repository
| Repository | PRs Reviewed | Issues Found | Addressed |
|------------|--------------|--------------|-----------|
| myorg/backend | 18 | 42 | 38 (90%) |
| myorg/frontend | 15 | 28 | 25 (89%) |
| myorg/shared-lib | 14 | 15 | 15 (100%) |

### Top Issue Categories
1. **Security** - 12 comments (8 addressed)
2. **Performance** - 18 comments (16 addressed)
3. **Style** - 35 comments (32 addressed)

### Unaddressed Critical Issues
- PR #234: SQL injection vulnerability in user.service.js
- PR #241: Missing auth check in admin routes
```

Comment Trend Analysis

Analyze patterns in your team's code review feedback:

```
Search all Greptile comments for "security" and show me the trend over time
```

Or for specific patterns:

```
What are the most common issues Greptile finds in our codebase?
```
Use this to identify:
  • Recurring issues that need documentation
  • Training opportunities for the team
  • Candidates for new custom context rules
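
If you want to reproduce the trend analysis outside the assistant, the keyword search above is a straightforward group-by-week. A sketch, assuming each comment carries a hypothetical `body` string and `created_at` date (not the confirmed MCP schema):

```python
from collections import Counter
from datetime import date


def keyword_trend(comments, keyword):
    """Count comments mentioning `keyword`, grouped by ISO (year, week).

    Each comment is assumed to be {"body": str, "created_at": date}.
    """
    weekly = Counter()
    for c in comments:
        if keyword.lower() in c["body"].lower():
            iso = c["created_at"].isocalendar()
            weekly[(iso[0], iso[1])] += 1  # (ISO year, ISO week)
    return dict(sorted(weekly.items()))
```

A rising count for a given keyword is a signal that the topic deserves a custom context rule or team documentation.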

Team Performance Metrics

Generate metrics on how your team responds to reviews:

```
What's our average time to address Greptile comments?
Show me by repository.
```
Key metrics to track:
  • Address rate: Percentage of comments marked as addressed
  • Response time: How quickly comments are resolved
  • Comment engagement: Upvote/downvote ratios (indicates review quality)
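
The first two metrics are easy to compute from raw comment data as well. A sketch, assuming hypothetical `addressed`, `created_at`, and `resolved_at` fields on each comment record:

```python
from datetime import datetime
from statistics import median


def response_metrics(comments):
    """Compute address rate and median time-to-resolution in hours.

    Assumed (hypothetical) fields per comment:
    "addressed" (bool), "created_at" (datetime), "resolved_at" (datetime or None).
    """
    if not comments:
        return {"address_rate_pct": 0.0, "median_hours": None}
    addressed = [c for c in comments if c["addressed"]]
    rate = 100 * len(addressed) / len(comments)
    hours = [
        (c["resolved_at"] - c["created_at"]).total_seconds() / 3600
        for c in addressed
        if c.get("resolved_at")
    ]
    return {
        "address_rate_pct": round(rate, 1),
        "median_hours": round(median(hours), 1) if hours else None,
    }
```

Median is used rather than mean so one long-stale comment doesn't dominate the response-time figure.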

Custom Reports

Security Audit Report

```
Generate a security audit report: list all unaddressed security comments
across all repositories, grouped by severity
```

Pre-Release Checklist

```
For the upcoming release, show me:
1. All open PRs with unaddressed comments
2. Any security or critical issues pending
3. PRs with failed or incomplete reviews
```

Onboarding Report

```
Generate an onboarding report showing:
1. All active custom context patterns
2. Most common issues in each repository
3. Recent examples of good and bad patterns
```

Exporting Data

Ask your assistant to format data for export:

```
Export all code reviews from the past month as a CSV with columns:
PR number, repository, status, issues found, issues addressed, completion time
```
Your assistant can format the MCP response data into:
  • CSV for spreadsheets
  • Markdown tables for documentation
  • JSON for further processing
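
The CSV shape described in the prompt can also be produced directly with the standard library. A sketch, where the input field names are assumptions for illustration rather than the real MCP schema:

```python
import csv
import io


def reviews_to_csv(reviews):
    """Render review records as CSV with the columns from the prompt above.

    Each record is assumed to be a dict keyed by the column names below.
    """
    columns = [
        "pr_number", "repository", "status",
        "issues_found", "issues_addressed", "completion_time",
    ]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    writer.writerows(reviews)
    return buf.getvalue()
```

Swapping `csv.DictWriter` for `json.dumps` (or a markdown-table formatter) covers the other two export targets.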

Visualization Ideas

While MCP tools return raw data, you can ask your assistant to create text-based visualizations:

```
Create an ASCII chart showing review activity by day of week
```

Or export data to visualization tools:

```
Format our weekly review data for import into a Grafana dashboard
```
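
If you'd rather render the day-of-week chart yourself, a tally maps naturally onto a horizontal bar chart. A minimal sketch, taking an already-computed `{label: count}` mapping:

```python
def ascii_bar_chart(counts, width=40):
    """Render a {label: count} mapping as a horizontal ASCII bar chart."""
    if not counts:
        return ""
    peak = max(counts.values())
    pad = max(len(label) for label in counts)
    lines = []
    for label, n in counts.items():
        # Scale each bar relative to the largest value.
        bar = "#" * (round(width * n / peak) if peak else 0)
        lines.append(f"{label:<{pad}} | {bar} {n}")
    return "\n".join(lines)
```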

Automated Reporting

Set up regular reports by creating prompts you can reuse.

Daily standup prompt:

```
Quick summary: How many PRs were reviewed yesterday?
Any critical issues still open?
```

Weekly team report prompt:

```
Generate our weekly code review report with:
- PR count by repository
- Top 5 unaddressed issues
- Comment address rate trend
```

Monthly retrospective prompt:

```
Month-over-month comparison:
- Total reviews
- Average issues per PR
- Most improved repository
- Top recurring issues to address
```

Next Steps