How to Package Your Creative Leadership Knowledge Into AI Skills (And Why It's Worth the Effort)

Reading Time: 12 minutes

I've been leading creative teams for over a decade. In that time, I've developed opinions about everything: how to write a brief that actually gets good work, how to apply brand standards without killing creativity, how to talk to executives versus engineers versus sales teams about the same project.

Most of that knowledge just lives in my head. Some of it lives in Google Docs that nobody reads.

A couple months ago, I spent time packaging this into Claude Skills. I'd been working with AI for years at this point (generating video content, building interactive campaigns, that kind of thing). But Skills felt different. Instead of asking AI to create something new, I was teaching it to replicate how I think about process and evaluation.

It's a totally different kind of leverage, and the results were better than I expected. Not because it worked (I expected that), but because of how much of the tedious, time-consuming work it could actually handle. Logo spacing checks that normally eat up 30 minutes apiece. Hunting through scattered Slack threads and email chains to find information for creative briefs. Writing developer handoff specs that take longer to document than the design took to create. All those "creative" tasks that require zero creativity but consume hours of time.

"Skills," if you haven't run into them yet, are basically instruction sets you create once, and Claude loads automatically when it's relevant. You're teaching Claude your specific processes, frameworks, and standards. You explain how you do something in detail, save it as a Skill, and from then on Claude applies that methodology without you needing to re-explain it every single time.

For creative leaders specifically, this solves a problem we don't really talk about enough. A lot of our time gets eaten by work that doesn't actually require our expertise. Skills let you offload all the work I just mentioned so you can focus on what actually requires your decades of experience.

Why This Matters Now

Every creative leader I know is dealing with some version of the same situation: Teams are leaner. Timelines are shorter. Expectations haven't shrunk (and in some cases have gotten bigger). You're supposed to maintain creative excellence while also moving faster, collaborating across more functions, and doing more with even less money and headcount than you had last year.

AI won't replace your creative judgment. I'm genuinely not interested in that use case. What it can do is handle the checklist work, the information gathering, the documentation generation. The stuff that has to get done but doesn't need a senior creative leader spending their time on it.

The specific advantage of Skills over "basic AI" is that instead of hoping Claude happens to know good creative process on its own, you're actually teaching it your creative process. And the difference is substantial.

What I've Actually Built (And What Worked)

Here are the Skills I've currently built and actually use, along with what they do (instructions on how to build your own similar Skills are further down in the article):

Brand Compliance Checker

This is the one that saves me the most time. Hands down. It automates all the tedious, objective checks that eat hours during design reviews. Especially when you're working with external vendors or contractors who don't know your guidelines that well (or at all, in some cases).

What it checks automatically:

  • Logo size and spacing (minimum dimensions, clear space requirements)

  • Color codes (correct hex/RGB/CMYK values)

  • Font usage (correct typefaces, weights, sizes)

  • Brand element relationships (like when you have partner logos and there are specific rules about how they relate to each other)

  • Technical stuff (file formats, resolution, color space)

What it flags for me to review:

  • Potential tone mismatches ("The headline feels more formal than your typical conversational approach")

  • Communication goal conflicts ("The CTA says 'Learn More' but the body copy is driving toward immediate signup")

  • Strategic concerns ("This focuses heavily on features but the brief emphasized emotional benefits")

  • Hierarchy questions ("The secondary message is getting more visual weight than the primary value proposition")

How I actually work with it:

I upload a design. Claude runs through the objective checks first and flags any violations with specific explanations. Logo width is 42px but minimum spec is 50px. Color is #1A2B3D but brand primary blue is #1A2B3C.

Then it gives me a summary of the subjective stuff. Tone appears more technical and formal than typical brand voice. Primary CTA emphasizes exploration while body copy builds urgency for conversion. Consider whether message strategy and desired action are aligned.

I still review everything myself. But instead of spending 60-90 minutes measuring spacing and checking color codes, I spend maybe 15 minutes confirming Claude caught everything. Then I can spend 30-45 minutes on the actual strategic feedback about hierarchy, messaging and creative execution.
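For a sense of what this looks like in practice, here's the rough shape of the report I ask for. The specific values below are illustrative, not pulled from a real review:

```markdown
## Objective violations
- Logo clear space: 8px on the right side; minimum spec is 12px based on C-height
- Body copy color: #1A2B3D; brand primary blue is #1A2B3C
- Disclaimer text: 9px; minimum approved size is 10px

## Flagged for human review
- Tone: headline reads more formal than the brand's typical conversational voice
- Message alignment: CTA says "Learn More" while the body copy drives toward immediate signup
```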

What this replaced:

The absolutely soul-crushing work of marking up vendor designs with 47 comments that all essentially say "follow the brand guidelines correctly." I've spent entire afternoons, sometimes multiple days, doing nothing but measuring pixel dimensions and flagging incorrect fonts. That work still happens, but now AI is doing most of it for me.

More importantly, it prevents guideline violations from slipping through when I'm rushed or focused on bigger strategic issues. When violations slide by, it creates precedent. Suddenly guidelines become "suggestions" and brand consistency starts to erode. This Skill catches things I might miss when I'm moving fast.

Creative Brief Information Gatherer

Writing a good creative brief is part detective work, part synthesis, and part translation. One of the more time-consuming tasks is gathering all the scattered information needed to turn stakeholder input into something creatives can actually work with.

That's where this Skill comes into play. It handles the information gathering and creates a preliminary outline that I can refine from there.

If you were expecting a "Creative Brief Generator," this ain't it, my friends… I chose the words in that subhead very deliberately. This is purely for gathering the information that's needed for a real human to write the actual brief. There's some stuff that AI just can't replicate. Use the robots for what they do well and leave the rest to us humans.

What it does:

I point it at the project folder (Slack threads, strategy docs, previous campaign learnings, stakeholder emails… whatever exists, really) and it pulls out things like:

  • Who the target audience is, with all the supporting details from multiple sources

  • What they currently believe (synthesized from research docs and past campaign data)

  • What we want them to believe instead (extracted from strategy documents)

  • Success metrics and KPIs (gathered from project briefs and stakeholder input)

  • Constraints and requirements (timeline, budget, technical limitations, deliverable specs)

  • Links to all the reference materials and supporting documentation

What it flags:

  • Information gaps ("No clear success metric found in source materials")

  • Conflicting objectives ("Strategy doc emphasizes brand awareness but stakeholder emails focus on conversion")

  • Missing context ("No data on past performance of similar campaigns")

How I actually work with it:

Instead of spending hours hunting through 47 Slack messages, 12 Google Docs, and 23 email threads to find all the relevant information, I get a structured summary in about 20 minutes. Claude identifies what information exists, what's missing, and where things conflict and need to be resolved.
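If it helps to picture the output, the structured summary comes back looking roughly like this (the project details here are made up for illustration):

```markdown
## Target audience
- Mid-market operations leads evaluating a platform switch (sources: strategy doc, stakeholder kickoff email)

## Current belief → desired belief
- "Switching is risky and slow" → "Migration is low-risk and pays off within a quarter" (source: research summary)

## Success metrics
- None clearly stated in source materials — flag for stakeholder conversation

## Conflicts
- Strategy doc emphasizes brand awareness; stakeholder emails focus on conversion
```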

I still do the hard work. Translating all that input into "the single most important thing this needs to accomplish." Defining the real problem, not the multi-paragraph, ten-problems-for-the-price-of-one problem statement written by committee. It's judgment work that requires understanding the business context, the stakeholder dynamics, and what actually drives good creative work. But now, I'm not wasting cognitive energy on information scavenger hunts.

The stakeholder approval process still requires me as well. There's explanation, listening, persuasion, compromise and trust-building work. I'm helping stakeholders understand how and why I've translated their 30 bullet points into one concise strategic statement. That's inherently human work. But I'm having those conversations with a much more complete picture of what information exists and where the gaps are.

Inquisitive personalities abhor a vacuum. Without data to fill the gaps, questions are bound to arise. And not the good "what if we did [this] to make it even more strategically sound?" questions, but the "what if we made [the entire page] one big button?" ones. The ones that lead to missed deadlines and endless review cycles and work everyone looks back at and wonders "why?"

What this replaced:

The preliminary grunt work of brief development. The "wait, where did I see that audience insight?" and "which doc had the budget information?" and "didn't someone mention competitive context in that Slack thread three weeks ago?"

That kind of work now takes 30 minutes instead of 3 hours. I can spend my energy on the actual brief development. The synthesis. The strategic framing. The translation of business objectives into creative direction.

Developer Handoff Documentation Generator

Designers create beautiful work. Then they spend hours writing technical documentation so developers can actually build it. This Skill handles that second part.

What it generates:

  • Component specifications (dimensions, spacing, padding, margins)

  • Interaction patterns (hover states, transitions, animations)

  • Responsive behavior (breakpoints, scaling rules, mobile adaptations)

  • Asset requirements (image formats, sizes, optimization specs)

  • Typography details (font stacks, sizes, line heights, letter spacing)

  • Color values (hex, RGB, CMYK for all brand and custom colors)

  • Accessibility requirements (contrast ratios, alt text guidance, ARIA labels)

How I work with it:

Designer finishes work in Figma. I upload the design file. Claude generates the handoff documentation following our specific format and technical standards. Designer reviews it for accuracy and completeness (maybe 15 minutes). Done.

The documentation is consistent across designers and projects because it follows the same template and includes the same level of detail every time. Developers aren't getting different formats depending on which designer they're working with.
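A typical excerpt from the generated handoff doc looks something like this. The component and values are illustrative, not from an actual spec:

```markdown
## Component: Primary CTA button
- Dimensions: 48px height, 24px horizontal padding, 16px vertical padding
- Typography: brand sans, semibold, 16px / 24px line height
- States: hover darkens background 10%, 150ms ease-in-out; focus shows 2px outline
- Responsive: full width below the 480px breakpoint
- Accessibility: text-to-background contrast of at least 4.5:1; descriptive aria-label on icon-only variants
```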

What this replaced:

Designers spending 2 to 3 hours writing specs that are technically accurate but mind-numbingly boring to create. That time now goes to actual design work. Developers get consistent, complete documentation. Everyone's happier.

Production Requirements Synthesizer

This one prevents the "this looks amazing but won't work at 300x250" problem that haunts every other creative review in a modern marketing team today.

What it does:

Takes the full list of deliverable specifications for a campaign (all the banner sizes, social formats, email dimensions, landing page constraints, everything) and creates what I call an "executional brief." A guiding document that shows designers the boundaries before they start concepting.

What it includes:

  • Size constraints (smallest width, largest width, smallest height, largest height across all deliverables)

  • Color space requirements (RGB for digital and/or CMYK for print, any limitations on spot colors)

  • File format specifications (source files, final delivery formats, compression requirements)

  • Copy length limits (character counts for headlines, body copy, CTAs across all the different sizes)

  • Asset reuse opportunities (which creative elements need to work across multiple formats)

  • Technical constraints (animation limitations, file size caps, loading requirements)

How I work with it:

Before creative concepting starts, I run the deliverable specs through this Skill. It identifies the constraints that will impact design decisions. Not just the obvious ones (e.g. needs to work at both 160x600 and 728x90), but the less obvious ones too (e.g. CTAs need to work in both 20 characters and 40 characters across the full set).

Designers review this executional brief before they start concepting. They're designing within the actual constraints from the beginning, not discovering in round three that their beautiful concept doesn't scale to the smallest format.
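Here's an abbreviated example of what one of these executional briefs contains. The numbers are illustrative rather than real campaign specs:

```markdown
## Size envelope
- Smallest unit: 300x250; largest: 970x250 — hero art must crop cleanly to both

## Copy limits
- Headlines: must read at 20 characters (smallest units) and can stretch to 40 (largest)
- CTAs: single verb phrase, 15 characters max

## Technical constraints
- Animated units: 150KB initial load, 15-second loop maximum
- Digital deliverables in RGB; print pieces in CMYK with no spot colors
```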

What this replaced:

The painful "this concept is great but we need to redesign it because it won't work in half the required formats" conversation that happens in round two or three of review. Now those constraints are visible upfront. Designs that reach formal review are designs that can actually be produced across the full scope.

Stakeholder Communication Suite

I created three separate Skills for this because different audiences need fundamentally different information.

Executive Communication Skill: Leads with business impact and strategic implications. Includes metrics and ROI framing. Acknowledges risks and tradeoffs explicitly. Keeps creative process detail minimal unless it's specifically relevant to decisions they need to make.

Cross-Functional Partner Skill (Product, Engineering, Sales): Focuses on dependencies and integration points. Explains creative rationale in functional terms, not aesthetic terms. Anticipates common objections and addresses them before they come up. Includes enough process context that they understand timelines and why things take as long as they do.

Team Communication Skill: Provides creative rationale and strategic context. Explains the "why" behind decisions. Includes tactical next steps and clear ownership. Maintains the nuance that gets lost in executive summaries.

Each Skill includes examples of effective communication for that audience and common mistakes to avoid, like explaining color theory to a CFO (don't do it), or being too high-level with designers who actually need tactical, prescriptive direction.

How I work with it:

I draft one comprehensive update with all the information. Then I ask Claude to adapt it for each audience using the relevant Skill. Each stakeholder gets exactly what they need in the format they expect. I review and refine each version (usually just minor edits), but I'm not starting from scratch three times.

The cognitive load difference is significant. Instead of constantly shifting modes (executive mindset, then technical mindset, then team leadership mindset), I stay in one mode and let the Skills handle the translation.
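As a quick illustration of how the same fact lands differently per audience (the wording below is hypothetical, not from a real update):

```markdown
Source update: "Concept B tested better but needs two extra weeks of motion work."

- Executive version: "Recommending Concept B: stronger test performance, two-week schedule impact, no budget change."
- Cross-functional version: "Concept B adds two weeks of motion production, which shifts the landing page handoff by one sprint."
- Team version: "We're moving forward with Concept B because testing showed it lands the emotional benefit better. Motion team owns the additional animation passes; timeline details are in the project doc."
```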

What this replaced:

Drafting the same update three times manually. Or worse, sending one version to everyone and watching it satisfy nobody because executives want less detail and the team wants more context and cross-functional partners want different context entirely.

How to Actually Build a Skill (Step by Step)

The process of creating a Skill is pretty straightforward, but doing it well takes some thought. Here's how I approach it:

Step 1: Pick One Repetitive Task That Eats Your Time

Don't try to codify your entire expertise at once. Start with something specific that consumes hours of your time without requiring your creative judgment.

Good candidates:

  • Documentation work that follows a consistent format

  • Information gathering from scattered sources

  • Objective compliance checks against established standards

  • Reformatting or adapting content for different contexts

Bad candidates (at least for your first one):

  • Processes that change constantly based on context

  • Purely subjective creative judgment with no consistent criteria

  • Things that require extensive proprietary information you can't share

  • Work that requires reading room dynamics or managing stakeholder politics

I started with brand compliance checking because I was spending 60 to 90 minutes per vendor review on work that didn't require my expertise. Just adherence to documented standards.

Step 2: Document Your Actual Process (Not the Idealized Version)

Write down how you actually do this thing. Including the specific checks you run and the criteria you apply.

Be specific. What do you check first? In what order? What are the objective pass/fail criteria versus the subjective judgment calls? What tells you something needs flagging for further review? What does done look like for this task?

Include examples. Real examples from your work (anonymized if necessary). The difference between "check brand compliance" and showing a logo spacing violation with the specific measurement that's wrong is the difference between a Skill that works and one that doesn't.

For my brand compliance Skill, I included:

  • Our actual brand guidelines (the technical specifications, not the conceptual stuff)

  • Three examples of common violations with specific measurements showing what's wrong

  • Three examples of subjective issues that need human judgment (tone, hierarchy, strategic alignment)

  • The exact format I want for flagging issues ("Logo width is 42px but minimum spec is 50px")

This took maybe two hours to write. Most of that time was finding good examples and documenting the exact specs.

Step 3: Structure It As Instructions, Not Documentation

Skills work best when written as clear instructions for how to do something. Not as reference material about a topic.

Less effective: "Our brand maintains consistent spacing around the logo."

More effective: "Check logo clear space. Measure the minimum distance from the logo to any other element in all directions. Minimum clear space equals the height of the 'C' in the logo mark. Flag any violations with specific measurements: 'Logo has 8px clear space on the right side but minimum spec is 12px based on C-height.'"

The structure I use for most Skills:

  1. Purpose and Context (2 to 3 sentences on what this Skill does and when it applies)

  2. Objective Checks (The measurable, pass/fail criteria with specific standards)

  3. Subjective Flags (What to surface for human review and how to describe it)

  4. Output Format (How to structure the findings, violations separate from observations)

  5. Examples (Real cases showing both objective violations and subjective flags)

You don't have to follow this exactly. But separating objective checks from subjective observations has been crucial for me.
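To make that structure concrete, here's a bare-bones sketch of what an instructions.md might look like for a brand compliance Skill, abbreviated heavily. A real one carries far more detail and your own specs:

```markdown
# Brand Compliance Checker

## Purpose and Context
Check uploaded designs against our brand guidelines. Apply whenever a design file or vendor deliverable is shared for review.

## Objective Checks
1. Logo clear space: minimum equals the height of the "C" in the logo mark, measured in all directions.
2. Color values: primary blue must be exactly #1A2B3C (hex) / 26, 43, 60 (RGB).
3. Typography: approved typefaces and weights only; flag substitutions.

## Subjective Flags
- Tone that reads noticeably more formal or technical than the brand's conversational voice.
- CTA wording that conflicts with what the body copy is driving toward.

## Output Format
List objective violations first, each with the measured value and the spec it misses.
List subjective observations separately, framed as questions for human review.

## Examples
- Violation: "Logo has 8px clear space on the right side but minimum spec is 12px based on C-height."
- Flag: "The headline feels more formal than your typical conversational approach."
```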

Step 4: Create the Actual Skill File

The technical part is simpler than you'd think. A basic Skill is just a folder containing:

manifest.json (describes what the Skill does and when Claude should use it)
instructions.md (your detailed process documentation)
examples/ (optional, for reference materials, templates, brand guidelines)

For non-technical creative leaders: you can write everything in Markdown (basically just formatted text) and don't need to code anything.

Here's a simplified version of my brand compliance Skill structure:

brand-compliance-skill/
├── manifest.json
├── instructions.md
└── examples/
    ├── brand-guidelines.pdf
    ├── violation-example.png
    └── subjective-flag-example.png

The manifest.json is minimal:

{
  "name": "Brand Compliance Checker",
  "description": "Checks designs against brand guidelines for objective violations and flags subjective concerns",
  "version": "1.0"
}

The real work is in instructions.md, which contains all the methodology I documented in Step 2.

If you're on a Claude Team or Enterprise plan, you can create and manage Skills through the Console interface without touching any files directly. For Pro plans, you work with the file structure, but it's still just writing in Markdown.

Step 5: Test With Real Work

Don't just create the Skill and assume it works. Try it on actual projects and see what happens.

I tested my brand compliance Skill by running it on some recent vendor designs where I'd already done manual reviews. Then I compared. Did it catch everything I caught? Did it flag things that weren't actually issues? How useful was the output format?

The first version missed some edge cases (logo on dark backgrounds has different clear space requirements, which I hadn't documented). I added those specifications. The second version was too aggressive about flagging tone concerns. It was flagging every design that wasn't explicitly conversational, even when a more formal tone was appropriate for the context. I refined the criteria for when tone should be flagged.

Pay attention to: Does Claude invoke the Skill when you'd expect it to? Are the objective checks actually catching violations accurately? Are the subjective flags useful or is it flagging things that don't matter? Is the output format easy to act on?

Step 6: Refine Based on What Actually Gets Used

After a few weeks of using a Skill, patterns emerge about what works and what doesn't.

My developer handoff Skill needed the most refinement. The initial version included too much detail. Specs that developers didn't actually need and that just cluttered the documentation. I stripped it down to what developers actually reference. Now the output is leaner and more useful.

The production requirements Skill needed more detail going the other direction. The initial version identified constraints but didn't explain why they mattered or how they connected. I added context. "Character count limited to 35 for headlines because smallest banner size (300x250) can't accommodate more at minimum readable font size." Now designers understand not just what the constraint is but why it exists.

You'll know a Skill is working when you start trusting its outputs enough that you're reviewing and refining rather than redoing the work yourself to be sure.

I went through four iterations on my brand compliance Skill alone before it consistently caught what I needed it to catch without over-flagging, and I plan on going through more iterations yet. When it comes to foundational work like this, time invested has a multiplier effect on cumulative time saved.

What I Learned From Actually Doing This

The Tedious Work Takes More Time Than You Realize

Before I tracked it, I didn't realize how much time I spent on objective compliance checks, information gathering, documentation generation. It felt like background work. The stuff you do between the "real work" of providing creative direction and strategic feedback.

Turns out, that background work was consuming 30 to 40 percent of my time. Logo spacing checks during design reviews. Hunting through Slack and email for brief information. Writing developer specs. Reformatting updates for different audiences.

None of that work required my creative judgment or strategic expertise. But it was eating 12 to 15 hours per week.

Now that work takes maybe 3 to 4 hours per week (mostly reviewing what Claude produced rather than doing it from scratch). The 8 to 11 hours I got back doesn't mean I'm working less. It means I'm spending that time on work that actually requires 18 years of experience.

Skills Work Best for Checklist Work and Synthesis

The Skills that save me the most time are the ones handling objective verification (brand compliance, technical specs), information aggregation (gathering scattered documentation), format translation (stakeholder communications, developer handoffs), and constraint mapping (production requirements).

These are all tasks with clear criteria, consistent processes, defined outputs.

The Skills that don't work as well are the ones that try to encode creative judgment. I experimented with a Skill for evaluating concept quality and it was... not useful. It could identify whether a design followed compositional principles, but it couldn't tell me if the concept was strategically sound or creatively distinctive.

Process and checklist work can be systematized. Creative judgment can't.

This Changes Where Your Expertise Actually Adds Value

Once Skills handle the first-pass compliance work and information synthesis, your role shifts.

I used to spend a lot of time on initial reviews. Catching guideline violations. Gathering information. Writing documentation. That work still happens, but now Claude (using my Skills) does the first pass.

I spend my time on second-order questions. Not "is the logo too small" but "does this design advance our positioning or just execute a tactic?" Not "where did I see that audience insight" but "how do we translate this insight into creative direction?" Not "what are the specs for this component" but "is this the right interaction pattern for this experience?"

The work that requires pattern recognition from seeing thousands of projects. The strategic decisions that depend on understanding the business context. The creative leaps that make work distinctive rather than just adequate.

That's where 18 years of experience actually matters. That's where I'm spending my time now.

You Can't Just Brain Dump Your Process

The hardest part of creating Skills isn't the technical setup. It's articulating your process clearly enough that it can be followed systematically.

There's a fantastic little article about how difficult this is in practice; the author uses a peanut butter and jelly sandwich experiment as proof. If you have three minutes, I highly suggest giving it a read.

I thought I could just describe how I check brand compliance and it would work. It didn't. I had to actually think through: What do I measure first and why? What's the threshold for flagging an issue? How do I distinguish between objective violations and subjective concerns? What context does someone need to act on this feedback?

This is valuable work even if you never create a Skill. Making your expertise explicit makes you better at teaching it to your team. It challenges you: is this just gut instinct, or transferable information? Am I making these decisions based on what I can prove I know, or just what I assume to be true?

The Skills I use most are the ones where I was clearest about separating objective criteria from subjective judgment. When I tried to blend them, the Skill became inconsistent.

Skills Scale Process Knowledge Better Than Documentation

We've all worked at companies with brand guidelines that sit in a shared folder, unread. Process documentation that's technically available but rarely consulted. Templates that exist but get used inconsistently.

Skills are different because they're automatically invoked when relevant. You don't have to remember to check the brand guidelines. Claude loads the brand compliance Skill when you ask it to review a design. The knowledge gets applied consistently because the system prompts its use.

For creative leaders, this means your standards get maintained even when you're not personally reviewing every piece of work. Guideline violations get caught. Documentation follows consistent formats. Information gathering follows the same process.

That consistency compounds. Fewer revision cycles. Less rework. Higher baseline quality even when you're not in the room.

Some Things This Isn't

Skills handle process, checklists, information synthesis. They don't replace creative judgment, strategic decision making, or the ability to read room dynamics and manage stakeholder relationships.

I work with AI for creative production regularly (Midjourney, Mixboard, Firefly, Figma Make, and others for concept exploration, Veo3 and Runway for video production, custom GPTs for various specialized tasks). But Skills serve a different purpose. They're about automating the tedious work and creating consistent processes, not generating creative output.

They also make junior team members more effective faster because they have access to the same compliance checks, information gathering processes, documentation standards that senior team members use. But those team members still need to develop their own judgment over time. Access to good process doesn't replace experience.

The difference between asking Claude to check brand compliance and asking Claude to check brand compliance using your specific brand standards and flagging criteria is substantial. One gives you generic feedback that may or may not be relevant. The other gives you consistent evaluation against your actual standards.

Is This Worth Your Time?

The setup time per Skill is maybe 2 to 4 hours for something straightforward, longer for complex processes. I've created five Skills over the past month and spent probably 15 to 20 hours total, including iteration and refinement.

The time savings: I'm spending maybe 8 to 11 hours less per week on compliance checks, information gathering, documentation work. That's 32 to 44 hours per month. The ROI was positive after the first two weeks.

But the more significant benefit isn't time savings. It's consistency and focus.

Work gets evaluated against the same standards regardless of whether I personally reviewed it. Documentation follows the same format across designers and projects. Information gathering follows the same process for every brief. That consistency means fewer surprises, less rework, higher baseline quality.

And I'm spending my time on work that actually requires my expertise. The strategic creative direction. The judgment calls that come from seeing thousands of projects. The stakeholder relationships that determine whether good work actually ships.

If you're a creative leader who spends significant time on compliance checks, information gathering, documentation work, or reformatting content for different audiences, this is probably worth exploring.

How to Start

Don't try to build a comprehensive library of Skills immediately. Start with one task that eats your time without requiring your creative judgment.

For me, that was brand compliance checking. For you, it might be developer handoff documentation, production requirement synthesis, information gathering for creative briefs, stakeholder communication formatting, or something else entirely.

Pick one. Spend a few hours documenting how you actually do it. Create the Skill. Use it for a week. Refine it based on what works and what doesn't.

Then pick the next thing.

After five Skills, you'll have automated a meaningful portion of the tedious work that consumes your time without requiring your expertise. That's leverage that's hard to get any other way.

If you're already working with Skills for creative leadership work, I'm curious what you've built. Drop me an email or a message on LinkedIn. The more we share frameworks, the less we all have to start from scratch.

How to Actually Build a Skill (Step by Step)

The process of creating a Skill is pretty straightforward, but doing it well takes some thought. Here's how I approach it:

Step 1:

Pick One Repetitive Task That Eats Your Time

Don't try to codify your entire expertise at once. Start with something specific that consumes hours of your time without requiring your creative judgment.

Good candidates:

  • Documentation work that follows a consistent format

  • Information gathering from scattered sources

  • Objective compliance checks against established standards

  • Reformatting or adapting content for different contexts

Bad candidates (at least for your first one):

  • Processes that change constantly based on context

  • Purely subjective creative judgment with no consistent criteria

  • Things that require extensive proprietary information you can't share

  • Work that requires reading room dynamics or managing stakeholder politics

I started with brand compliance checking because I was spending 60 to 90 minutes per vendor review on work that didn't require my expertise. Just adherence to documented standards.

Step 2:

Document Your Actual Process (Not the Idealized Version)

Write down how you actually do this thing. Including the specific checks you run and the criteria you apply.

Be specific. What do you check first? In what order? What are the objective pass/fail criteria versus the subjective judgment calls? What tells you something needs flagging for further review? What does done look like for this task?

Include examples. Real examples from your work (anonymized if necessary). The difference between "check brand compliance" and showing an example of a logo spacing violation with the specific measurement that's wrong, that's the difference between a Skill that works and one that doesn't.

For my brand compliance Skill, I included:

  • Our actual brand guidelines (the technical specifications, not the conceptual stuff)

  • Three examples of common violations with specific measurements showing what's wrong

  • Three examples of subjective issues that need human judgment (tone, hierarchy, strategic alignment)

  • The exact format I want for flagging issues ("Logo width is 42px but minimum spec is 50px")

This took maybe two hours to write. Most of that time was finding good examples and documenting the exact specs.

Step 3:

Structure It As Instructions, Not Documentation

Skills work best when written as clear instructions for how to do something. Not as reference material about a topic.

Less effective: "Our brand maintains consistent spacing around the logo."

More effective: "Check logo clear space. Measure the minimum distance from the logo to any other element in all directions. Minimum clear space equals the height of the 'C' in the logo mark. Flag any violations with specific measurements: 'Logo has 8px clear space on the right side but minimum spec is 12px based on C-height.'"

The structure I use for most Skills:

  1. Purpose and Context (2 to 3 sentences on what this Skill does and when it applies)

  2. Objective Checks (The measurable, pass/fail criteria with specific standards)

  3. Subjective Flags (What to surface for human review and how to describe it)

  4. Output Format (How to structure the findings, violations separate from observations)

  5. Examples (Real cases showing both objective violations and subjective flags)

You don't have to follow this exactly. But separating objective checks from subjective observations has been crucial for me.

Step 4:

Create the Actual Skill File

The technical part is simpler than you'd think. A basic Skill is just a folder containing:

manifest.json (describes what the Skill does and when Claude should use it)
instructions.md (your detailed process documentation)
examples/ (optional, for reference materials, templates, brand guidelines)

For non-technical creative leaders: you can write everything in Markdown (basically just formatted text) and don't need to code anything.

Here's a simplified version of my brand compliance Skill structure:

brand-compliance-skill/
├── manifest.json
├── instructions.md
└── examples/
├── brand-guidelines.pdf
├── violation-example.png
└── subjective-flag-example.png

The manifest.json is minimal:json

{ "name": "Brand Compliance Checker", "description": "Checks designs against brand guidelines for objective violations and flags subjective concerns", "version": "1.0" }

The real work is in instructions.md, which contains all the methodology I documented in Step 2.

If you're on a Claude Team or Enterprise plan, you can create and manage Skills through the Console interface without touching any files directly. For Pro plans, you work with the file structure, but it's still just writing in Markdown.

Step 5: Test With Real Work

Don't just create the Skill and assume it works. Try it on actual projects and see what happens.

I tested my brand compliance Skill by running it on some recent vendor designs where I'd already done manual reviews. Then I compared. Did it catch everything I caught? Did it flag things that weren't actually issues? How useful was the output format?

The first version missed some edge cases (logo on dark backgrounds has different clear space requirements, which I hadn't documented). I added those specifications. The second version was too aggressive about flagging tone concerns. It was flagging every design that wasn't explicitly conversational, even when a more formal tone was appropriate for the context. I refined the criteria for when tone should be flagged.

Pay attention to: Does Claude invoke the Skill when you'd expect it to? Are the objective checks actually catching violations accurately? Are the subjective flags useful or is it flagging things that don't matter? Is the output format easy to act on?

I went through four iterations on my brand compliance Skill before it consistently caught what I needed it to catch without over-flagging.

Step 6: Refine Based on What Actually Gets Used

After a few weeks of using a Skill, patterns emerge about what works and what doesn't.

My developer handoff Skill needed the most refinement. The initial version included too much detail. Specs that developers didn't actually need and that just cluttered the documentation. I stripped it down to what developers actually reference. Now the output is leaner and more useful.

The production requirements Skill needed more detail going the other direction. The initial version identified constraints but didn't explain why they mattered or how they connected. I added context. "Character count limited to 35 for headlines because smallest banner size (300x250) can't accommodate more at minimum readable font size." Now designers understand not just what the constraint is but why it exists.

You'll know a Skill is working when you start trusting its outputs enough that you're reviewing and refining rather than redoing the work yourself to be sure.

What I Learned From Actually Doing This

The Tedious Work Takes More Time Than You Realize

Before I tracked it, I didn't realize how much time I spent on objective compliance checks, information gathering, documentation generation. It felt like background work. The stuff you do between the "real work" of providing creative direction and strategic feedback.

Turns out, that background work was consuming 30 to 40 percent of my time. Logo spacing checks during design reviews. Hunting through Slack and email for brief information. Writing developer specs. Reformatting updates for different audiences.

None of that work required my creative judgment or strategic expertise. But it was eating 12 to 15 hours per week.

Now that work takes maybe 3 to 4 hours per week (mostly reviewing what Claude produced rather than doing it from scratch). The 8 to 11 hours I got back doesn't mean I'm working less. It means I'm spending that time on work that actually requires 18 years of experience.

Skills Work Best for Checklist Work and Synthesis

The Skills that save me the most time are the ones handling objective verification (brand compliance, technical specs), information aggregation (gathering scattered documentation), format translation (stakeholder communications, developer handoffs), and constraint mapping (production requirements).

These are all tasks with clear criteria, consistent processes, defined outputs.

The Skills that don't work as well are the ones that try to encode creative judgment. I experimented with a Skill for evaluating concept quality and it was... not useful. It could identify whether a design followed compositional principles, but it couldn't tell me if the concept was strategically sound or creatively distinctive.

Process and checklist work can be systematized. Creative judgment can't.

This Changes Where Your Expertise Actually Adds Value

Once Skills handle the first-pass compliance work and information synthesis, your role shifts.

I used to spend a lot of time on initial reviews. Catching guideline violations. Gathering information. Writing documentation. That work still happens, but now Claude (using my Skills) does the first pass.

I spend my time on second-order questions. Not "is the logo too small" but "does this design advance our positioning or just execute a tactic?" Not "where did I see that audience insight" but "how do we translate this insight into creative direction?" Not "what are the specs for this component" but "is this the right interaction pattern for this experience?"

The work that requires pattern recognition from seeing thousands of projects. The strategic decisions that depend on understanding the business context. The creative leaps that make work distinctive rather than just adequate.

That's where 18 years of experience actually matters. That's where I'm spending my time now.

You Can't Just Brain Dump Your Process

The hardest part of creating Skills isn't the technical setup. It's articulating your process clearly enough that it can be followed systematically.

I thought I could just describe how I check brand compliance and it would work. It didn't. I had to actually think through: What do I measure first and why? What's the threshold for flagging an issue? How do I distinguish between objective violations and subjective concerns? What context does someone need to act on this feedback?

This is valuable work even if you never create a Skill. Making your expertise explicit makes you better at teaching it to your team.

The Skills I use most are the ones where I was clearest about separating objective criteria from subjective judgment. When I tried to blend them, the Skill became inconsistent.

Skills Scale Process Knowledge Better Than Documentation

We've all worked at companies with brand guidelines that sit in a shared folder, unread. Process documentation that's technically available but rarely consulted. Templates that exist but get used inconsistently.

Skills are different because they're automatically invoked when relevant. You don't have to remember to check the brand guidelines. Claude loads the brand compliance Skill when you ask it to review a design. The knowledge gets applied consistently because the system prompts its use.

For creative leaders, this means your standards get maintained even when you're not personally reviewing every piece of work. Guideline violations get caught. Documentation follows consistent formats. Information gathering follows the same process.

That consistency compounds. Fewer revision cycles. Less rework. Higher baseline quality even when you're not in the room.

Some Things This Isn't

Skills handle process, checklists, information synthesis. They don't replace creative judgment, strategic decision making, or the ability to read room dynamics and manage stakeholder relationships.

I work with AI for creative production regularly (Midjourney for concept exploration, Veo3 for video production, custom GPTs for various specialized tasks). But Skills serve a different purpose. They're about automating the tedious work and creating consistent processes, not generating creative output.

They also make junior team members more effective faster because they have access to the same compliance checks, information gathering processes, documentation standards that senior team members use. But those team members still need to develop their own judgment over time. Access to good process doesn't replace experience.

The difference between asking Claude to check brand compliance and asking Claude to check brand compliance using your specific brand standards and flagging criteria is substantial. One gives you generic feedback that may or may not be relevant. The other gives you consistent evaluation against your actual standards.

Is This Worth Your Time?

The setup time per Skill is maybe 2 to 4 hours for something straightforward, longer for complex processes. I've created five Skills over the past month and spent probably 15 to 20 hours total, including iteration and refinement.

The time savings: I'm spending maybe 8 to 11 hours less per week on compliance checks, information gathering, documentation work. That's 32 to 44 hours per month. The ROI was positive after the first two weeks.

But the more significant benefit isn't time savings. It's consistency and focus.

Work gets evaluated against the same standards regardless of whether I personally reviewed it. Documentation follows the same format across designers and projects. Information gathering follows the same process for every brief. That consistency means fewer surprises, less rework, higher baseline quality.

And I'm spending my time on work that actually requires my expertise. The strategic creative direction. The judgment calls that come from seeing thousands of projects. The stakeholder relationships that determine whether good work actually ships.

If you're a creative leader who spends significant time on compliance checks, information gathering, documentation work, or reformatting content for different audiences, this is probably worth exploring.

How to Start

Don't try to build a comprehensive library of Skills immediately. Start with one task that eats your time without requiring your creative judgment.

For me, that was brand compliance checking. For you, it might be developer handoff documentation, production requirement synthesis, information gathering for creative briefs, stakeholder communication formatting, or something else entirely.

Pick one. Spend a few hours documenting how you actually do it. Create the Skill. Use it for a week. Refine it based on what works and what doesn't.

Then pick the next thing.

After five Skills, you'll have automated a meaningful portion of the tedious work that consumes your time without requiring your expertise. That's leverage that's hard to get any other way.

If you're already working with Skills for creative leadership work, I'm curious what you've built. The more we share frameworks, the less we all have to start from scratch.

What I Learned From Actually Doing This

The Tedious Work Takes More Time Than You Realize

Before I tracked it, I didn't realize how much time I spent on objective compliance checks, information gathering, documentation generation, etc. It felt like background work. The stuff you do between the "real work" of providing creative direction and strategic feedback.

Turns out, that background work was consuming 30-40% of my time. But now, that work takes maybe 3-4 hours per week (mostly reviewing what Claude produced rather than doing it from scratch). The 8-11 hours I got back doesn't mean I'm working less. It means I'm spending that time on work that actually requires (years and years of) experience.

Skills Work Best for Checklist Work and Synthesis

The Skills that save me the most time are the ones handling objective verification (brand compliance, technical specs), information aggregation (gathering scattered documentation), format translation (stakeholder communications, developer handoffs), and constraint mapping (production requirements).

These are all tasks with clear criteria, consistent processes, defined outputs.

The Skills that don't work as well are the ones that try to encode creative judgment. I experimented with a Skill for evaluating concept quality and it was... not useful, to say the least. It could identify whether a design followed compositional principles, but it couldn't tell me if the concept was strategically sound or creatively distinctive.

Process and checklist work can be systematized. Creative judgment can't.

This Changes Where Your Expertise Actually Adds Value

Once Skills handle the first-pass compliance work and information synthesis, your role shifts.

I used to spend a lot of time on initial reviews. Catching guideline violations. Gathering information. Writing documentation. That work still happens, but now Claude (using my Skills) does the first pass.

I spend my time on second-order questions. Not "Is the logo too small?" or "Where did I see that audience insight…" but "How do we translate this insight into creative direction? Does this design advance our positioning?"

The work that requires pattern recognition from seeing thousands of projects. The strategic decisions that depend on understanding the business context. The creative leaps that make work distinctive rather than just adequate. That's where I'm spending my time now.

You Can't Just Brain Dump Your Process

The hardest part of creating Skills isn't the technical setup. It's articulating your process clearly enough that it can be followed systematically.

There's a fantastic little article about how difficult this is in practice… the author uses a peanut and butter jelly sandwich experiment as proof… if you have 3 minutes, I highly suggest you give it a read.

I thought I could just describe how I check brand compliance and it would work. It didn't. I had to actually think through: What do I measure first and why? What's the threshold for flagging an issue? How do I distinguish between objective violations and subjective concerns? What context does someone need to act on this feedback?

This is valuable work even if you never create a Skill. Making your expertise explicit makes you better at teaching it to your team. It challenges you on "Is this just gut instinct, or transferrable information? Am I making these decisions based on what I can prove I know, or just what I assume to be true?"

The Skills I use most are the ones where I was clearest about separating objective criteria from subjective judgment. When I tried to blend them, the Skill became inconsistent.

Skills Scale Process Knowledge Better Than Documentation

We've all worked at companies with brand guidelines that sit in a shared folder, unread. Process documentation that's technically available but rarely consulted, and templates that exist but get used inconsistently.

Skills are different because they're automatically invoked when relevant. You don't have to remember to check the brand guidelines. Claude loads the brand compliance Skill when you ask it to review a design. The knowledge gets applied consistently because the system prompts its use.

For creative leaders, this means your standards get maintained even when you're not personally reviewing every piece of work. Guideline violations get caught. Documentation follows consistent formats, and that consistency compounds. Fewer revision cycles. Less rework. Higher baseline quality even when you're not in the room.

Some Things This Isn't

Skills handle process, checklists, information synthesis. They don't replace creative judgment, strategic decision making, or the ability to read room dynamics and manage stakeholder relationships.

I work with AI for creative production regularly (Midjourney, Mixboard, Firefly, Figma Make and others for concept exploration, Veo3 and Runway for video production, custom GPTs for various specialized tasks). But Skills serve a different purpose. They're about automating the tedious work and creating consistent processes, not generating creative output.

They also make junior team members more effective faster because they have access to the same compliance checks, information gathering processes and documentation standards that senior team members use. But those team members still need to develop their own judgment over time; access to good process doesn't replace experience.

The difference between asking Claude to check brand compliance and asking Claude to check brand compliance using your specific brand standards and flagging criteria is substantial. One gives you generic feedback that may or may not be relevant. The other gives you consistent evaluation against your actual standards.

Is It Worth Your Time?

The setup time per Skill is maybe 2-4 hours for something straightforward, longer for complex processes. I've created five Skills over the past month and spent probably 15-20 hours total, including iteration and refinement.

The time savings: I'm spending maybe 8-11 hours less per week on the busywork (compliance checks, information gathering, documentation, etc.), which works out to roughly 32-44 hours per month. But the more significant benefit isn't time savings; it's consistency and focus.

Work gets evaluated against the same standards regardless of whether I personally reviewed it. Documentation follows the same format across designers and projects. Information gathering follows the same process for every brief. That consistency means fewer surprises, less rework, higher baseline quality.

And I'm spending my time on work that actually requires my expertise. The strategic creative direction. The judgment calls that come from seeing thousands of projects. The stakeholder relationships that determine whether good work actually ships.

If you're a creative leader who spends significant time on compliance checks, information gathering, documentation work, or reformatting content for different audiences, this is probably worth exploring.

How to Start

Don't try to build a comprehensive library of Skills immediately. Start with one task that eats your time without requiring your creative judgment.

For me, that was brand compliance checking. For you, it might be developer handoff documentation, production requirement synthesis, information gathering for creative briefs, stakeholder communication formatting, or something else entirely.

Pick one. Spend a few hours documenting how you actually do it. Create the Skill. Use it for a week. Refine it based on what works and what doesn't.

Then pick the next thing.

After a few Skills, you'll have automated a meaningful portion of the tedious work that consumes your time without requiring your expertise. That's leverage that's hard to get any other way.

Also, if you're already working with Skills for creative leadership work, I'm curious to hear what you've built. Drop me an email or message me on LinkedIn. The more we share frameworks, the less we all have to start from scratch.

How to Actually Build a Skill (Step by Step)

The process of creating a Skill is pretty straightforward, but doing it well takes some thought. Here's how I approach it:

Step 1:

Pick One Repetitive Task That Eats Your Time

Don't try to codify your entire expertise at once. Start with something specific that consumes hours of your time without requiring your creative judgment.

Good candidates:

  • Documentation work that follows a consistent format

  • Information gathering from scattered sources

  • Objective compliance checks against established standards

  • Reformatting or adapting content for different contexts

Bad candidates (at least for your first one):

  • Processes that change constantly based on context

  • Purely subjective creative judgment with no consistent criteria

  • Things that require extensive proprietary information you can't share

  • Work that requires reading room dynamics or managing stakeholder politics

I started with brand compliance checking because I was spending 60 to 90 minutes per vendor review on work that didn't require my expertise. Just adherence to documented standards.

Step 2:

Document Your Actual Process (Not the Idealized Version)

Write down how you actually do this thing. Including the specific checks you run and the criteria you apply.

Be specific. What do you check first? In what order? What are the objective pass/fail criteria versus the subjective judgment calls? What tells you something needs flagging for further review? What does done look like for this task?

Include examples. Real examples from your work (anonymized if necessary). The difference between "check brand compliance" and an actual logo spacing violation with the specific measurement that's wrong is the difference between a Skill that works and one that doesn't.

For my brand compliance Skill, I included:

  • Our actual brand guidelines (the technical specifications, not the conceptual stuff)

  • Three examples of common violations with specific measurements showing what's wrong

  • Three examples of subjective issues that need human judgment (tone, hierarchy, strategic alignment)

  • The exact format I want for flagging issues ("Logo width is 42px but minimum spec is 50px")

This took maybe two hours to write. Most of that time was finding good examples and documenting the exact specs.
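None of this has to become code, but if you want to pressure-test your own documentation, here's a minimal Python sketch of what separating objective checks from subjective flags can look like as plain data. The check names, specs, and thresholds are hypothetical, loosely mirroring the list above; if a check can't be written this cleanly, it probably belongs in the subjective bucket.

```python
from dataclasses import dataclass


@dataclass
class ObjectiveCheck:
    """A pass/fail rule with a measurable threshold and an exact flag format."""
    name: str
    spec: str            # the documented standard, quoted from the guidelines
    flag_template: str   # exact wording to use when the check fails


@dataclass
class SubjectiveFlag:
    """Something to surface for human judgment, not to auto-fail."""
    name: str
    what_to_describe: str


# Hypothetical entries, mirroring the brand compliance example above
objective_checks = [
    ObjectiveCheck(
        name="logo_minimum_width",
        spec="Logo width must be at least 50px",
        flag_template="Logo width is {measured}px but minimum spec is 50px",
    ),
]

subjective_flags = [
    SubjectiveFlag(
        name="tone",
        what_to_describe="Where the copy reads noticeably more formal or casual than the brand voice",
    ),
]

if __name__ == "__main__":
    # Prints the exact flag wording the Skill should produce
    print(objective_checks[0].flag_template.format(measured=42))
```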

Step 3:

Structure It As Instructions, Not Documentation

Skills work best when written as clear instructions for how to do something. Not as reference material about a topic.

Less effective: "Our brand maintains consistent spacing around the logo."

More effective: "Check logo clear space. Measure the minimum distance from the logo to any other element in all directions. Minimum clear space equals the height of the 'C' in the logo mark. Flag any violations with specific measurements: 'Logo has 8px clear space on the right side but minimum spec is 12px based on C-height.'"
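The Skill itself stays in plain language, but a rule that specific is easy to sanity-check. Here's a rough Python sketch of the same logic with made-up measurements; the Skill reads the instruction above, not this code.

```python
def check_logo_clear_space(c_height_px: float, clear_space_px: dict[str, float]) -> list[str]:
    """Flag any side where measured clear space is below the C-height minimum.

    clear_space_px maps a side ("top", "right", "bottom", "left") to the
    measured distance from the logo to the nearest element on that side.
    """
    minimum = c_height_px  # minimum clear space equals the height of the 'C'
    violations = []
    for side, measured in clear_space_px.items():
        if measured < minimum:
            violations.append(
                f"Logo has {measured:g}px clear space on the {side} side "
                f"but minimum spec is {minimum:g}px based on C-height."
            )
    return violations


# Hypothetical measurements from a design review
print(check_logo_clear_space(12, {"top": 14, "right": 8, "bottom": 13, "left": 16}))
# -> ['Logo has 8px clear space on the right side but minimum spec is 12px based on C-height.']
```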

The structure I use for most Skills:

  1. Purpose and Context (2 to 3 sentences on what this Skill does and when it applies)

  2. Objective Checks (The measurable, pass/fail criteria with specific standards)

  3. Subjective Flags (What to surface for human review and how to describe it)

  4. Output Format (How to structure the findings, violations separate from observations)

  5. Examples (Real cases showing both objective violations and subjective flags)

You don't have to follow this exactly. But separating objective checks from subjective observations has been crucial for me.

Step 4:

Create the Actual Skill File

The technical part is simpler than you'd think. A basic Skill is just a folder containing:

manifest.json (describes what the Skill does and when Claude should use it)
instructions.md (your detailed process documentation)
examples/ (optional, for reference materials, templates, brand guidelines)

For non-technical creative leaders: you can write everything in Markdown (basically just formatted text) and don't need to code anything.

Here's a simplified version of my brand compliance Skill structure:

brand-compliance-skill/
├── manifest.json
├── instructions.md
└── examples/
    ├── brand-guidelines.pdf
    ├── violation-example.png
    └── subjective-flag-example.png

The manifest.json is minimal:

{
  "name": "Brand Compliance Checker",
  "description": "Checks designs against brand guidelines for objective violations and flags subjective concerns",
  "version": "1.0"
}

The real work is in instructions.md, which contains all the methodology I documented in Step 2.
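If you'd rather not create the files by hand, a few lines of Python can scaffold the layout described above. This is just a convenience sketch, not part of any official tooling; the folder name, description, and starter headings are placeholders you'd swap for your own.

```python
import json
from pathlib import Path


def scaffold_skill(folder: str, name: str, description: str) -> None:
    """Create the minimal folder layout described above with starter files."""
    root = Path(folder)
    (root / "examples").mkdir(parents=True, exist_ok=True)

    manifest = {"name": name, "description": description, "version": "1.0"}
    (root / "manifest.json").write_text(json.dumps(manifest, indent=2) + "\n")

    # Starter headings mirror the five-part structure from Step 3
    starter = (
        f"# {name}\n\n"
        "## Purpose and Context\n\n"
        "## Objective Checks\n\n"
        "## Subjective Flags\n\n"
        "## Output Format\n\n"
        "## Examples\n"
    )
    (root / "instructions.md").write_text(starter)


scaffold_skill(
    "brand-compliance-skill",
    "Brand Compliance Checker",
    "Checks designs against brand guidelines for objective violations and flags subjective concerns",
)
```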

If you're on a Claude Team or Enterprise plan, you can create and manage Skills through the Console interface without touching any files directly. For Pro plans, you work with the file structure, but it's still just writing in Markdown.

Step 5: Test With Real Work

Don't just create the Skill and assume it works. Try it on actual projects and see what happens.

I tested my brand compliance Skill by running it on some recent vendor designs where I'd already done manual reviews. Then I compared. Did it catch everything I caught? Did it flag things that weren't actually issues? How useful was the output format?
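My comparison was two lists side by side in a doc, but the logic behind it is simple enough to write down. Here's a sketch, with hypothetical issue labels, of the three buckets I was really sorting into:

```python
def compare_reviews(manual: set[str], skill: set[str]) -> dict[str, set[str]]:
    """Compare issues you flagged by hand against issues the Skill flagged."""
    return {
        "caught_by_both": manual & skill,
        "missed_by_skill": manual - skill,        # gaps to document in the instructions
        "flagged_only_by_skill": skill - manual,  # possible over-flagging to rein in
    }


# Hypothetical labels from one vendor review
manual_review = {"logo-clear-space", "off-palette-cta", "headline-too-long"}
skill_review = {"logo-clear-space", "headline-too-long", "tone-too-formal"}

for bucket, issues in compare_reviews(manual_review, skill_review).items():
    print(bucket, sorted(issues))
```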

The first version missed some edge cases (logos on dark backgrounds have different clear space requirements, which I hadn't documented). I added those specifications. The second version was too aggressive about flagging tone concerns: it flagged every design that wasn't explicitly conversational, even when a more formal tone was appropriate for the context. I refined the criteria for when tone should be flagged.

Pay attention to: Does Claude invoke the Skill when you'd expect it to? Are the objective checks actually catching violations accurately? Are the subjective flags useful or is it flagging things that don't matter? Is the output format easy to act on?

I went through four iterations on my brand compliance Skill before it consistently caught what I needed it to catch without over-flagging.

Step 6: Refine Based on What Actually Gets Used

After a few weeks of using a Skill, patterns emerge about what works and what doesn't.

My developer handoff Skill needed the most refinement. The initial version included too much detail. Specs that developers didn't actually need and that just cluttered the documentation. I stripped it down to what developers actually reference. Now the output is leaner and more useful.

The production requirements Skill needed more detail going the other direction. The initial version identified constraints but didn't explain why they mattered or how they connected. I added context. "Character count limited to 35 for headlines because smallest banner size (300x250) can't accommodate more at minimum readable font size." Now designers understand not just what the constraint is but why it exists.
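That constraint-plus-reason pairing is worth keeping even outside the Skill. A small sketch using the headline example above, with made-up copy and a hypothetical helper name:

```python
# Constraint and the reason it exists, mirroring the headline example above
HEADLINE_MAX_CHARS = 35
REASON = (
    "smallest banner size (300x250) can't accommodate more "
    "at minimum readable font size"
)


def check_headline(headline: str) -> str | None:
    """Return a flag that states both the violation and why the limit exists."""
    if len(headline) <= HEADLINE_MAX_CHARS:
        return None
    return (
        f"Headline is {len(headline)} characters but the limit is "
        f"{HEADLINE_MAX_CHARS}, because {REASON}."
    )


print(check_headline("Creative leadership, packaged as a repeatable process"))
```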

You'll know a Skill is working when you start trusting its outputs enough that you're reviewing and refining rather than redoing the work yourself to be sure.



Want to see more of my work?

I'm currently seeking Director/VP-level creative leadership roles at established tech/SaaS companies. My background includes:

  • Brand Transformation: Led award-winning rebrand at Celigo (GDUSA, Gold ADDY recognition) that saved $500K+ on a single project

  • Creative Operations: Built systems that increased team output 238% while maintaining quality

  • Strategic Innovation: Developed AI-powered tools and data-informed processes that connect creative excellence to measurable business impact


View my portfolio or connect with me on LinkedIn if you'd like to chat about creative leadership, operational excellence, or how to build more research-informed creative teams.

Find this interesting?

Schedule a Call. Let's chat!

KENDAL RICHER

holler@kendalricher.com
330 459 4993

3114 Woodland Trail
Avon, OH 44011

Copyright © Kendal Richer