
30/60-Day Check-In Survey Template

Survey questions for team members affected by automation. What's working? What's frustrating?

Automation rollouts fail when firms treat implementation as a one-time event. You deploy the tool, run a training session, then wonder why adoption stalls at 40% by month three.

The fix: structured feedback loops at 30 and 60 days post-launch. This template gives you the exact questions to ask, the distribution method that gets 80%+ response rates, and the analysis framework to turn raw feedback into action items by Friday.

What This Template Does

This is a 15-question survey split into five sections: training effectiveness, productivity impact, daily usage reality, technical issues, and support/next steps. Scaled questions use a 5-point format, and each section pairs them with open-ended follow-ups.

You'll run it twice. First at day 30 to catch early adoption killers. Second at day 60 to measure whether your fixes worked.

The output: a one-page dashboard showing which teams are thriving, which are struggling, and exactly what to fix next week.

When to Deploy This Survey

Use this template when you've rolled out:

Document automation tools (HotDocs, Contract Express, Smokeball). Target: paralegals, associates, legal assistants handling high-volume document production.

Time capture automation (Clio, TimeSolv, BigTime). Target: attorneys and consultants who bill by the hour.

Client intake automation (Lawmatics, Lexicata, PracticePanther). Target: intake coordinators, client services teams, business development staff.

Research automation (ROSS Intelligence, Casetext, Fastcase). Target: associates and senior paralegals doing legal research.

Workflow automation (Zapier, Make, Power Automate connecting your practice management system to other tools). Target: operations staff, project managers, administrative teams.

Do NOT use this for minor feature updates or optional tools. Reserve it for changes that affect daily workflows for 10+ people.

Survey Distribution Protocol

Timing Windows

30-Day Survey: Send on day 28-30 after go-live. Earlier and users haven't formed real opinions. Later and you've missed the window to fix early problems.

60-Day Survey: Send on day 58-60. This captures the post-honeymoon reality when initial enthusiasm fades and true adoption patterns emerge.
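If you track go-live dates in a shared rollout calendar, the send dates are simple date arithmetic. A minimal sketch in Python (the 28- and 58-day offsets come straight from the windows above; the go-live date is illustrative):

```python
from datetime import date, timedelta

def survey_send_dates(go_live: date) -> dict[str, date]:
    """Earliest send date for each check-in survey after go-live."""
    return {
        "30_day": go_live + timedelta(days=28),  # send window: day 28-30
        "60_day": go_live + timedelta(days=58),  # send window: day 58-60
    }

# Example: a March 4 go-live puts the sends on April 1 and May 1.
print(survey_send_dates(date(2024, 3, 4)))
```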

Distribution Method

Use Microsoft Forms (if you're on M365) or Google Forms (if you're on Workspace). Both are free, both integrate with your existing systems, and both allow anonymous responses. To filter by department without identifying individuals, add an optional team/department question to the survey itself.

Skip SurveyMonkey unless you already have an enterprise license. The free tier caps at 10 questions and limits exports.

The Email Template

Subject: [2 minutes] How's [Tool Name] working for you?

Body: "We rolled out [Tool Name] four weeks ago. I need your honest feedback on what's working and what's broken.

This survey takes 2 minutes. Your responses are anonymous. I'm reading every answer and will share what we're fixing by [specific date].

[Survey Link]

Thanks, [Your Name]"

Send from a partner or department head, not from IT or operations. Response rates jump 20-30% when the request comes from someone with authority to actually fix problems.

Response Rate Targets

Aim for 75% minimum. Below that and you're getting skewed data from only the most frustrated or most enthusiastic users.

If you're at 50% after 3 days, send one reminder. If you're still below 60% after 5 days, make the survey a standing agenda item in your next team meeting and have people complete it live.
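Those thresholds are easy to encode if you want a consistent escalation rule across rollouts. A minimal sketch (the counts are placeholders):

```python
def reminder_action(responses: int, recipients: int, days_open: int) -> str:
    """Apply the escalation rules above to a survey's current state."""
    rate = responses / recipients
    if rate >= 0.75:
        return "on track: close as scheduled"
    if days_open >= 5 and rate < 0.60:
        return "make it a standing agenda item; complete live in the next meeting"
    if days_open >= 3 and rate <= 0.50:
        return "send one reminder"
    return "keep waiting"

print(reminder_action(responses=14, recipients=30, days_open=3))  # send one reminder
```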

The Survey Questions

Copy this into your survey tool. Replace [TOOL NAME] with your actual tool name. Replace [OLD PROCESS] with whatever this tool replaced.

Section 1: Training Effectiveness

Q1: The training prepared me to use [TOOL NAME] in my daily work.

  • Strongly agree
  • Agree
  • Neutral
  • Disagree
  • Strongly disagree

Q2: What specific part of the training was most useful? [Open text field]

Q3: What should we have covered in training but didn't? [Open text field]

Section 2: Productivity Impact

Q4: Compared to [OLD PROCESS], [TOOL NAME] has made me:

  • Much more productive (saving 2+ hours per week)
  • Somewhat more productive (saving 30-120 minutes per week)
  • About the same
  • Somewhat less productive (losing 30-120 minutes per week)
  • Much less productive (losing 2+ hours per week)

Q5: Which specific tasks are now faster because of [TOOL NAME]? [Open text field]

Q6: Which tasks are now slower or more complicated? [Open text field]

Section 3: Daily Usage Reality

Q7: I use [TOOL NAME] for the tasks it was designed for:

  • Always (90-100% of the time)
  • Usually (60-89% of the time)
  • Sometimes (30-59% of the time)
  • Rarely (10-29% of the time)
  • Never (0-9% of the time)

Q8: When I don't use [TOOL NAME], it's because: [Open text field - this question reveals your real adoption blockers]

Q9: The tool does what I need it to do:

  • Strongly agree
  • Agree
  • Neutral
  • Disagree
  • Strongly disagree

Section 4: Technical Issues

Q10: I've experienced technical problems with [TOOL NAME]:

  • Never
  • Once or twice
  • Weekly
  • Daily
  • Multiple times per day

Q11: Describe the most frustrating technical issue you've encountered: [Open text field]

Q12: When I have a problem with [TOOL NAME], I know where to get help:

  • Strongly agree
  • Agree
  • Neutral
  • Disagree
  • Strongly disagree

Section 5: Support and Next Steps

Q13: The support I've received for [TOOL NAME] has been:

  • Excellent
  • Good
  • Adequate
  • Poor
  • Terrible

Q14: What one change would make [TOOL NAME] work better for you? [Open text field - this is your priority list]

Q15: Any other feedback? [Open text field]

Analysis Framework

Don't just read the responses. Run this analysis within 48 hours of closing the survey.

Step 1: Flag Critical Issues

Any response indicating "much less productive" or "multiple times per day" technical problems gets flagged for immediate follow-up. Even if responses are anonymous, you can often identify the team or role based on the described workflow.

Schedule 15-minute calls with affected users within one week.
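If your survey tool exports responses to CSV, this flag is a two-condition filter. A minimal sketch (the Q4/Q10/Q11 column names mirror the question numbers above, but your export's headers will differ):

```python
import csv

CRITICAL_Q4 = "Much less productive (losing 2+ hours per week)"
CRITICAL_Q10 = "Multiple times per day"

def flag_critical(path: str) -> list[dict]:
    """Return responses that need a follow-up call within one week."""
    with open(path, newline="") as f:
        return [
            row for row in csv.DictReader(f)
            if row["Q4"] == CRITICAL_Q4 or row["Q10"] == CRITICAL_Q10
        ]

for row in flag_critical("survey_30day.csv"):
    # The Q11 free text usually hints at the team even when responses are anonymous.
    print(row["Q11"][:80])
```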

Step 2: Calculate Adoption Score

Count responses to Q7. Your adoption score is the percentage who answered "Always" or "Usually" (a quick sketch of the calculation follows the bands below).

  • 80%+ = Healthy adoption, focus on optimization
  • 60-79% = Moderate adoption, investigate Q8 responses
  • Below 60% = Adoption crisis, halt any expansion plans
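A minimal sketch of that calculation, with the bands above encoded (the answer strings match the Q7 options verbatim):

```python
def adoption_score(q7_answers: list[str]) -> tuple[float, str]:
    """Adoption score = share of Q7 answers starting 'Always' or 'Usually'."""
    active = sum(a.startswith(("Always", "Usually")) for a in q7_answers)
    score = 100 * active / len(q7_answers)
    if score >= 80:
        status = "healthy: focus on optimization"
    elif score >= 60:
        status = "moderate: investigate Q8 responses"
    else:
        status = "adoption crisis: halt expansion plans"
    return round(score, 1), status

answers = ["Always (90-100% of the time)"] * 18 + ["Sometimes (30-59% of the time)"] * 7
print(adoption_score(answers))  # (72.0, 'moderate: investigate Q8 responses')
```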

Step 3: Identify Training Gaps

Read every Q3 response. Group similar answers. If 5+ people mention the same missing training topic, schedule a supplemental training session within two weeks.
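Grouping free text is judgment work, but a rough keyword tally surfaces the obvious clusters before you read line by line. A sketch (the seed topics are placeholders; build yours from a quick skim of the actual answers):

```python
from collections import Counter

TOPICS = ["templates", "billing codes", "permissions", "conflicts check"]  # placeholders

def training_gap_counts(q3_answers: list[str]) -> Counter:
    """Count how many Q3 answers mention each seed topic."""
    counts = Counter()
    for answer in q3_answers:
        lowered = answer.lower()
        counts.update(topic for topic in TOPICS if topic in lowered)
    return counts

gaps = training_gap_counts(["Wish we'd covered custom templates", "billing codes setup"])
for topic, n in gaps.most_common():
    flag = "  <- 5+ mentions: schedule a supplemental session" if n >= 5 else ""
    print(f"{topic}: {n}{flag}")
```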

Step 4: Build Your Fix List

Export all Q14 responses. Use this prompt in ChatGPT or Claude:

"I'm analyzing feedback on a new tool rollout. Here are all the 'one change' suggestions from our team. Group these into themes, rank by frequency, and identify the top 3 changes we should prioritize: [paste all Q14 responses]"

Step 5: Create Your Dashboard

Build a one-page summary with:

  • Overall adoption score (from Q7)
  • Net productivity impact (% more productive minus % less productive from Q4; see the sketch below)
  • Top 3 technical issues (from Q11)
  • Top 3 requested changes (from Q14)
  • Comparison to 30-day results (for the 60-day survey)
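The only arithmetic here is the net productivity figure: the share of Q4 answers on the "more productive" side minus the share on the "less productive" side. A minimal sketch:

```python
def net_productivity(q4_answers: list[str]) -> float:
    """Percent answering 'more productive' minus percent answering 'less productive'."""
    n = len(q4_answers)
    more = sum("more productive" in a.lower() for a in q4_answers)
    less = sum("less productive" in a.lower() for a in q4_answers)
    return round(100 * (more - less) / n, 1)

answers = (
    ["Much more productive (saving 2+ hours per week)"] * 9
    + ["About the same"] * 8
    + ["Somewhat less productive (losing 30-120 minutes per week)"] * 3
)
print(net_productivity(answers))  # 30.0 -> a +30-point net impact
```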

What to Do With the Results

Share the dashboard with your team within one week of closing the survey. Include:

  1. What you heard (the top 3-5 themes)
  2. What you're fixing immediately (with specific deadlines)
  3. What you're investigating (issues that need more analysis)
  4. What you're not changing (and why)

The last point matters. If people request features the tool doesn't have or changes that would break other workflows, explain that clearly. Silence on feedback is worse than saying no.

Schedule fixes for the next 30 days. When you run the 60-day survey, you should see measurable improvement in the areas you addressed.

If your 60-day results are worse than your 30-day results, you have a fundamental tool fit problem. That's a different conversation, but at least you'll know within two months instead of six.

Reviewed by Revenue Institute

This guide is actively maintained and reviewed by the implementation experts at Revenue Institute. As the creators of The AI Workforce Playbook, we test and deploy these exact frameworks for professional services firms scaling without new headcount.

Need help turning this guide into reality? Revenue Institute builds and implements the AI workforce for professional services firms.

RevenueInstitute.com