If you use Fastlane for builds, your App Store marketing screenshots are probably still assembled by hand. Here's how to automate the pipeline end to end.
Traditional Fastlane Approach
Fastlane's snapshot tool runs your UI tests and captures raw screenshots for every device and language combination you list:
lane :screenshots do
  snapshot(
    devices: ["iPhone 16 Pro Max", "iPhone SE"],
    languages: ["en-US", "de-DE", "ja"],
    clear_previous_screenshots: true
  )
end
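If you prefer to keep device and language lists out of the Fastfile, snapshot also reads the same configuration from a Snapfile. A minimal sketch; the scheme name is a placeholder, and output_directory is set to the ./raw folder the CI example feeds to appframes:

```ruby
# Snapfile: equivalent configuration to the lane above.
# "YourAppUITests" is a placeholder; use your own UI test scheme.
scheme("YourAppUITests")
devices(["iPhone 16 Pro Max", "iPhone SE"])
languages(["en-US", "de-DE", "ja"])
clear_previous_screenshots(true)
output_directory("./raw")
```

With a Snapfile in place, running `fastlane snapshot` with no arguments picks up these values automatically.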
But these are raw captures — no marketing copy, no backgrounds, no device frames.
The Missing Step
After capture, you still need to:
- write marketing copy
- apply your design system
- add device mockups
- generate every required size
- localize the text

This is where automation usually breaks down.
AI-Powered Pipeline
# 1. Capture raw screenshots
fastlane snapshot

# 2. AI generates marketing screenshots
npx appframes generate --input ./raw --output ./marketing

# 3. Upload to App Store Connect
fastlane deliver --skip_metadata --overwrite_screenshots
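The same three steps can also live in a single lane. A sketch under two assumptions: raw captures land in ./raw, and the appframes CLI is available via npx (the paths here are illustrative, not verified defaults):

```ruby
# Fastfile sketch: the whole pipeline in one lane.
# "./raw" and "./marketing" are assumed paths, matching the CLI example above.
lane :marketing_screenshots do
  snapshot(output_directory: "./raw")
  sh("npx appframes generate --input ./raw --output ./marketing")
  deliver(
    skip_metadata: true,
    overwrite_screenshots: true,
    screenshots_path: "./marketing"
  )
end
```

Note that Fastlane's `sh` action runs from the fastlane directory, so adjust the relative paths to match your project layout.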
MCP Server Integration
For AI coding assistants (Cursor, Claude, Copilot), AppFrames offers an MCP server:
- Agent reads your README for app description
- Processes raw screenshots from your project
- Generates marketing-ready assets with AI
- Outputs to your Fastlane screenshots folder
- Localizes to all supported languages
CI/CD Example (GitHub Actions)
jobs:
  screenshots:
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v4
      - name: Capture raw screenshots
        run: fastlane snapshot
      - name: Generate marketing screenshots
        run: npx appframes generate --input ./raw --output ./marketing
      - name: Upload to App Store Connect
        run: fastlane deliver
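One gap in the workflow above: deliver needs App Store Connect credentials when it runs in CI. A common approach is an App Store Connect API key supplied through repository secrets; a hedged sketch, where the ASC_* environment variable names are placeholders you would populate from your CI's secret store:

```ruby
# Fastfile: authenticate deliver in CI with an App Store Connect API key.
# ASC_KEY_ID, ASC_ISSUER_ID, and ASC_KEY_CONTENT are placeholder env var
# names; map them to repository secrets in your workflow file.
lane :upload_screenshots do
  api_key = app_store_connect_api_key(
    key_id: ENV["ASC_KEY_ID"],
    issuer_id: ENV["ASC_ISSUER_ID"],
    key_content: ENV["ASC_KEY_CONTENT"]
  )
  deliver(
    api_key: api_key,
    skip_metadata: true,
    overwrite_screenshots: true
  )
end
```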
AppFrames works as a web tool today, with MCP access coming soon. Try it free →
Ready to create your screenshots?
Describe your app. AI generates the rest — copy, design, layouts, and localization.
Try AppFrames Free →