Reviews & Reputation Management
Review acquisition tactics that comply with platform rules, response templates for positive and negative reviews, the Review schema example Google validates, the platforms that matter beyond GBP, and what to do about fake reviews.
Reviews are the single most visible local SEO signal — they appear in the local pack, on Maps, in the Knowledge Panel, and in AI Overview merchant carousels. They also drive conversion: 87% of users read reviews before contacting a local business (BrightLocal Consumer Survey 2024). Good reputation management is a continuous workflow, not a one-time ask. This module is the playbook.
TL;DR
- Review velocity, recency, and rating are all signals. A 4.2 average with 200 reviews and 5 in the last 30 days outranks a 4.9 average with 12 reviews from 2022. Recency matters as much as average rating.
- Acquisition has rules. Asking customers for reviews is fine. Incentivizing reviews, gating reviews (“only happy customers please”), or buying them violates Google’s review policy and triggers removal — and sometimes profile suspension.
- Respond to every review within 7 days. Response rate is a documented engagement signal Google uses, and it materially affects future review tone.
The mental model
Reviews are like referee scores in figure skating. The single highest score does not win the medal — the cumulative scoring across the program does. And the scores themselves are visible to the audience, so they shape perception of the next performance even before the routine starts.
A new prospect researching your business sees an aggregate rating, a recent reviews snapshot, and your responses. Each component does work. The aggregate is the headline; the recent reviews are the live read; the responses tell the prospect how you handle feedback. Manage all three.
Deep dive: the 2026 reality
What Google measures
| Signal | Weight |
|---|---|
| Total review count | High — but with diminishing returns past ~200 |
| Average rating | Very high — 4.0+ is the minimum competitive threshold |
| Recency | High — 30/60/90 day distribution matters |
| Velocity | Medium — sudden spikes look unnatural; steady cadence is healthy |
| Review text quality | Medium — long, specific reviews carry more weight than “Great!” |
| Response rate from owner | Medium — engagement signal |
| Sentiment in review text | Increasing — Google uses NLP to extract sentiment beyond stars |
| Reviewer profile authority | Low-medium — Local Guides count slightly more |
| Distribution across platforms | Medium — diverse reviews beat all-Google |
Acquisition tactics (compliant)
| Tactic | Effort | Conversion to review |
|---|---|---|
| In-person request at point of sale | Low | 20-35% |
| SMS follow-up after purchase | Medium | 12-25% |
| Email follow-up with deep link | Medium | 5-15% |
| QR code on receipt | Low | 3-8% |
| Tablet kiosk at exit | Medium | 8-15% |
| Post-service automated email | High setup | 10-20% |
The deep link to leave a Google review is built from your Place ID:
https://search.google.com/local/writereview?placeid=YOUR_PLACE_ID
Get your Place ID from developers.google.com/maps/documentation/places/web-service/place-id or the GBP “Get more reviews” share link in your dashboard.
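As a minimal sketch, the deep link can be assembled programmatically — useful when generating links for many locations. The Place ID below is a made-up placeholder, not a real one:

```python
from urllib.parse import urlencode

def review_deep_link(place_id: str) -> str:
    """Build the Google review deep link from a Place ID."""
    base = "https://search.google.com/local/writereview"
    return f"{base}?{urlencode({'placeid': place_id})}"

# Hypothetical Place ID for illustration only
link = review_deep_link("ChIJexample123")
```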
Sample SMS template (under 160 characters):
Hi [Name], thanks for choosing Hill Country Dental today.
A quick review helps us a lot — takes 30 seconds.
[short link]
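A sketch of filling that template and enforcing the 160-character limit before sending — the short link is a hypothetical placeholder:

```python
def build_sms(name: str, business: str, link: str) -> str:
    """Fill the review-request SMS template and enforce the 160-char limit."""
    msg = (f"Hi {name}, thanks for choosing {business} today. "
           f"A quick review helps us a lot - takes 30 seconds. {link}")
    if len(msg) > 160:
        raise ValueError(f"SMS is {len(msg)} chars; trim the template")
    return msg

# Hypothetical short link for illustration
sms = build_sms("Sarah", "Hill Country Dental", "https://g.co/r/abc123")
```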
What violates Google’s review policy
Per Google’s official policy (support.google.com/contributionpolicy/answer/7400114):
| Violation | Common in the wild | Penalty |
|---|---|---|
| Incentivizing reviews (“$5 off for a review”) | Yes | Review removal, profile warning |
| Gating reviews (“if you had a great experience, leave a review; if not, contact us”) | Very common | Profile suspension risk |
| Soliciting reviews on a kiosk where the same IP submits 50+ | Yes | Reviews flagged as fake |
| Fake reviews from staff, family, or paid third parties | Yes | Removal + suspension |
| Self-reviews from the owner account | Yes | Profile penalty |
| Reviews from competitors disparaging your business | Common | Reportable for removal |
Google’s spam detection uses IP, account age, review velocity, and text fingerprinting. Buying reviews from Fiverr or “review services” is the fastest way to lose your profile — modern detection catches the entire batch in 30-90 days.
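Text fingerprinting can be approximated with word shingles and Jaccard similarity — a toy sketch of the idea, not Google's actual method. Near-duplicate purchased reviews score high against each other; an organic, specific review does not:

```python
def shingles(text: str, n: int = 3) -> set:
    """Break review text into overlapping n-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between two reviews' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Illustrative texts: two near-duplicate bought reviews vs. one organic review
bought_1 = "Amazing service great staff highly recommend this business to everyone"
bought_2 = "Amazing service great staff highly recommend this place to everyone"
organic = "The same-day crown saved me a second trip and pricing was clear"
```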
Responding to reviews
Positive reviews
Acknowledge specifics. Avoid copy-paste responses; Google’s NLP detects boilerplate and discounts it.
Thank you, Sarah. We're glad the same-day crown worked out — it's
exactly the kind of case CEREC technology is built for. We'll
share your kind words with Dr. Patel and the front desk team.
See you at your next cleaning.
Negative reviews
The framework: acknowledge, contextualize without arguing, take it offline.
Hi James, thank you for sharing this. Long wait times during our
peak hours are a real issue we're working on with new scheduling
software rolling out this month. I'd like to make this right —
please email me directly at owner@hillcountrydental.com so I can
look at your visit and respond personally.
— Maria, owner
Do not litigate the facts publicly. Future readers see how you handle complaints, not who was technically right.
Reviews you cannot respond to (yet)
If Google has not yet shown you the review (sometimes there is a 24-48h delay), wait. Responding to a review you found via Google Alert before it appears in your dashboard creates a duplicate response.
Review schema
Marking up reviews on your own pages (e.g., a testimonial page) makes them eligible for rich results — with a caveat. Since 2019, Google has restricted self-serving aggregateRating for local-business pages: reviews you publish about your own business on your own site are no longer eligible for stars in the SERP for local-business types; only third-party-aggregated review markup is. The markup is still read by AI Overviews and other crawlers.
{
"@context": "https://schema.org",
"@type": "Dentist",
"name": "Hill Country Dental",
"url": "https://hillcountrydental.com",
"aggregateRating": {
"@type": "AggregateRating",
"ratingValue": "4.9",
"bestRating": "5",
"worstRating": "1",
"ratingCount": "127"
},
"review": [{
"@type": "Review",
"author": { "@type": "Person", "name": "Sarah Patel" },
"datePublished": "2026-04-12",
"reviewBody": "Same-day crown saved a trip. Dr. Patel was clear about pricing upfront.",
"reviewRating": {
"@type": "Rating",
"ratingValue": "5",
"bestRating": "5"
}
}]
}
Google’s eligibility rules: for LocalBusiness types, aggregateRating from the business itself is mostly ignored. Use third-party aggregator data (e.g., from Trustpilot, G2) where available. For products, services, courses, recipes, books, software apps — review markup is fully eligible for star rich results.
Validate via Schema Markup Validator (validator.schema.org) and Rich Results Test.
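A sketch of generating that JSON-LD from structured data (handy when reviews live in a CMS or database) and sanity-checking the required fields before publishing. Field names mirror the example above; the helper itself is illustrative:

```python
import json

def review_jsonld(business: dict, reviews: list) -> str:
    """Serialize business reviews as schema.org JSON-LD for a <script> tag."""
    payload = {
        "@context": "https://schema.org",
        "@type": business["type"],
        "name": business["name"],
        "url": business["url"],
        "review": [{
            "@type": "Review",
            "author": {"@type": "Person", "name": r["author"]},
            "datePublished": r["date"],
            "reviewBody": r["body"],
            "reviewRating": {"@type": "Rating",
                             "ratingValue": str(r["rating"]),
                             "bestRating": "5"},
        } for r in reviews],
    }
    return json.dumps(payload, indent=2)

doc = review_jsonld(
    {"type": "Dentist", "name": "Hill Country Dental",
     "url": "https://hillcountrydental.com"},
    [{"author": "Sarah Patel", "date": "2026-04-12",
      "body": "Same-day crown saved a trip.", "rating": 5}],
)
```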
Platforms beyond Google
| Platform | Audience | Why it matters |
|---|---|---|
| Yelp | Restaurants, services, retail | Apple Maps + Bing rely on Yelp data |
| Trustpilot | E-commerce, services | Schema flows into rich results |
| G2 | B2B SaaS | Critical for software buyer journey |
| Capterra | B2B SaaS | Same |
| TripAdvisor | Hospitality, travel | Dominant in the vertical |
| BBB | Trust signal across categories | Older audience, traditional businesses |
| Facebook | All consumer | Recommendations feed Meta search |
| Healthgrades / Vitals / ZocDoc | Medical | Vertical-specific |
| Avvo | Legal | Vertical-specific |
| Houzz / Angi / HomeAdvisor | Home services | Vertical-specific |
Fake review identification and removal
Indicators of fake reviews:
- Reviewer with only one review and no profile photo
- Reviews dated within minutes of each other
- Generic text that could apply to any business
- Reviews from accounts with similar naming patterns
- Spike in 1-star reviews matching a competitor’s product launch
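The indicators above can be combined into a simple triage score before deciding what to flag. This is a sketch with illustrative thresholds, not a detection product:

```python
from datetime import datetime, timedelta

def fake_score(review: dict, batch: list) -> int:
    """Score a review against the fake-review indicators above (0 = clean)."""
    score = 0
    # single-review account with no profile photo
    if review["reviewer_review_count"] <= 1 and not review["has_photo"]:
        score += 2
    # generic text that could apply to any business
    generic = {"great", "good service", "nice", "bad", "terrible"}
    if review["text"].lower().strip() in generic:
        score += 1
    # clustered timestamps: another review in the batch within 10 minutes
    t = review["posted_at"]
    if any(r is not review and abs((r["posted_at"] - t).total_seconds()) < 600
           for r in batch):
        score += 2
    return score

now = datetime(2026, 5, 1, 12, 0)
batch = [
    {"reviewer_review_count": 1, "has_photo": False, "text": "great",
     "posted_at": now},
    {"reviewer_review_count": 1, "has_photo": False, "text": "nice",
     "posted_at": now + timedelta(minutes=3)},
    {"reviewer_review_count": 44, "has_photo": True,
     "text": "Dr. Patel explained the crown pricing clearly.",
     "posted_at": now + timedelta(days=2)},
]
scores = [fake_score(r, batch) for r in batch]
```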
Removal workflow:
- Flag in GBP — open the review > three-dot menu > Flag as inappropriate
- Use the Business Profile support form for batch reports (support.google.com/business/contact/inappropriate_review_appeal)
- Document the pattern with screenshots, timestamps, and reviewer profile data
- Escalate via Twitter @GoogleMyBiz (now @GoogleBusinessProfile) for visibility
Google removes around 30-50% of reported reviews on the first attempt. Appeal again with more evidence if denied. For coordinated review-bombing attacks (covered in Module 51), maintain a documentation log throughout.
Visualizing it
flowchart TD
A[Customer interaction] --> B[Service delivered]
B --> C{Acquisition channel}
C --> D[In-person + QR]
C --> E[SMS follow-up]
C --> F[Email follow-up]
D --> G[Customer leaves review]
E --> G
F --> G
G --> H{Sentiment}
H -->|Positive| I[Acknowledge specifics in response]
H -->|Negative| J[Acknowledge + take offline]
H -->|Suspected fake| K[Flag for review by Google]
I --> L[Aggregate signals to GBP + AI surfaces]
J --> L
K --> M[Removal request]
L --> N[Local pack visibility + conversion]
Bad vs. expert
The bad approach
A restaurant owner places a tablet at the exit with the prompt: “Did you have a great experience? Tap YES to leave a Google review. Tap NO to send us private feedback.”
The flow:
Tap YES -> deep link to Google review form
Tap NO -> internal email form, never reaches Google
Google reviews skew 4.7 stars from 240 reviews. Six months later the profile is suspended. The owner files reinstatement, gets denied. Cause: review gating is an explicit policy violation. Google’s quality team detected it (likely via an undercover diner test or competitor report).
This fails because manipulating which customers can leave a review violates the platform terms. It is also detectable — Google’s spam team runs proactive policy enforcement.
The expert approach
Same restaurant. Compliant workflow:
review_workflow:
  channels:
    - point_of_sale_qr_card
    - sms_24h_after_visit
    - email_3d_after_visit
  template_variants: 6                 # rotate to avoid spam fingerprinting
  exclusions:
    - same_phone_within_30d            # avoid double-asking
    - guests_who_complained_in_person  # avoid bait-and-switch
  ask_text: "How was your visit?"      # neutral, no gating
  destination:                         # rotate weekly
    - google
    - yelp
    - tripadvisor
  response:
    sla_hours: 48
    template_library: 12_variants_by_situation
After 12 months: 423 reviews on Google (4.6 avg), 187 on Yelp (4.4), 92 on TripAdvisor (4.5). No suspensions. Local pack #1 for “[neighborhood] restaurants”. Profile resilient to occasional 1-star reviews because the volume cushion absorbs them.
This works because asking everyone (compliant), responding consistently (engagement signal), and diversifying platforms (cross-source trust) compound over time without rule-breaking.
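The exclusion rules in that workflow can be sketched as a gate applied before each send. Phone numbers and the data structures here are illustrative:

```python
from datetime import datetime, timedelta

def should_ask(phone: str, ask_log: dict, complained: set,
               now: datetime) -> bool:
    """Apply the workflow's exclusion rules before sending a review request."""
    if phone in complained:       # guests_who_complained_in_person
        return False
    last = ask_log.get(phone)     # same_phone_within_30d
    if last and now - last < timedelta(days=30):
        return False
    return True

now = datetime(2026, 6, 1)
ask_log = {"+15125550101": now - timedelta(days=10)}   # asked 10 days ago
complained = {"+15125550102"}                          # complained in person
```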
Do this today
- Find your Place ID at developers.google.com/maps/documentation/places/web-service/place-id. Build the deep link: https://search.google.com/local/writereview?placeid=YOUR_PLACE_ID.
- Generate a QR code (e.g., via qr-code-generator.com) for the deep link. Print 50 cards. Place them at the point of sale, on receipts, and in customer takeaway materials.
- Set up an SMS follow-up via Podium, Birdeye, NiceJob, or Trustpilot. Start with a single template and rotate to 3+ variants after 30 days.
- Set up an email follow-up via your email automation tool 3-5 days after service or purchase. Subject line should be conversational (“how did we do?”), not transactional.
- Audit your current request flow against Google's review policy (support.google.com/contributionpolicy/answer/7400114). Eliminate any incentives, gating, or selective targeting.
- Build a response template library with 8-12 variants for common situations. Assign daily review monitoring to one person with a 48-hour response SLA.
- Add third-party Review schema (Trustpilot widget or G2 widget for B2B SaaS) to your homepage. Validate in Rich Results Test.
- Open accounts on the relevant secondary platforms for your category (Yelp, Trustpilot, TripAdvisor, G2, Capterra, BBB, vertical-specific). Verify ownership and configure response notifications on each.
- Search Google Business Profile for any reviews you suspect are fake (1-review reviewer accounts, generic praise/complaint text, reviews clustered within minutes). Flag with Flag as inappropriate. Document for follow-up.
- Track monthly: review velocity, average rating per platform, response rate, response time. If velocity drops 30%+, fix the acquisition flow before anything else.
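The velocity check in the last step can be sketched as a monthly alert: compare the latest month's review count to the trailing three-month average. The 30% threshold comes from the checklist above; the function name is an assumption:

```python
def velocity_alert(monthly_counts: list, threshold: float = 0.30) -> bool:
    """Flag when the latest month's review count drops threshold (30%) or
    more versus the trailing 3-month average."""
    if len(monthly_counts) < 4:
        return False  # not enough history to compute a baseline
    baseline = sum(monthly_counts[-4:-1]) / 3
    latest = monthly_counts[-1]
    return baseline > 0 and (baseline - latest) / baseline >= threshold
```

A steady cadence like `[10, 12, 11, 11]` stays quiet; a drop to `[10, 12, 11, 6]` fires the alert.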