I remember the first time I discovered my site had been hit by a negative SEO attack—traffic tanked, rankings dropped, and my inbox filled with weird messages. It felt personal, like someone had sabotaged weeks (or months) of careful work. Since then, I’ve handled several attacks and helped clients recover. Below is the exact, actionable checklist I use every time to clean up, reclaim rankings, and harden sites against future hits. Follow it step by step: triage first, then surgery, and finally ongoing care.

Initial triage: confirm the attack and contain damage

Before doing anything drastic, you need to confirm that what you’re seeing is indeed a negative SEO attack and not an algorithm update, lost backlinks due to a third-party change, or seasonal fluctuation.

  • Check Google Search Console (GSC): Look for manual actions, security issues, and a sudden increase in 404s, soft 404s, or index coverage errors. GSC will often show site-wide problems that are not visible in analytics.
  • Compare organic traffic and ranking drops to known algorithm updates: Use resources like the Google Search Central Blog, Moz, or the SEO community on Twitter to rule out algorithm shifts.
  • Inspect backlinks: Run a quick backlink snapshot in tools like Ahrefs, Semrush, or Majestic. Look for a sudden spike of low-quality, spammy links from the same IPs or domains with porn, gambling, or unrelated foreign-language content.
  • Scan for hacked content or injected malware: Use Sucuri SiteCheck, your host’s scanning tools, or the “Security Issues” report in GSC. If a hack or malware is present, isolate the site behind a maintenance page while you clean.
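To make the backlink check concrete: pull the “first seen” dates out of an Ahrefs or Semrush export and flag any day with an abnormal burst of new referring domains. This is only a sketch — the 3× median threshold and the input shape are my own assumptions, not a standard; adapt it to whatever your tool actually exports:

```python
from collections import Counter
from datetime import date

def flag_link_spikes(first_seen_dates, threshold=3.0):
    """Flag days whose count of newly seen referring domains exceeds
    `threshold` times the (upper) median daily count."""
    if not first_seen_dates:
        return {}
    counts = Counter(first_seen_dates)
    daily = sorted(counts.values())
    median = daily[len(daily) // 2]
    return {d: n for d, n in counts.items() if n > threshold * median}

# Example: four quiet days, then a one-day burst of 40 new domains.
dates = [date(2024, 5, d) for d in (1, 2, 3, 4)] + [date(2024, 5, 5)] * 40
print(flag_link_spikes(dates))  # only the burst day is flagged
```

Four quiet baseline days followed by 40 new domains on one day flags only the burst day — the pattern you are looking for in an attack.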

Step 1 — Backlink audit and triage

Backlinks are the usual weapon in negative SEO. The goal here is to identify malicious links, document them, attempt removal, and prepare a disavow file.

  • Create a complete backlink list: Export links from GSC and at least one third-party tool (Ahrefs or Semrush). Combine and dedupe spreadsheets.
  • Filter for suspicious patterns: Mass links from newly created domains, exact-match anchor text spam, links from sites with zero organic traffic, foreign/malicious content, or link farms.
  • Log details: For each suspicious domain, record the source URL, anchor text, date found, and why it’s suspicious. This audit trail is crucial if you need to file a reconsideration request or contact Google.
  • Attempt manual removal: Reach out to webmasters asking for link removal. Use a clear template and include the exact URL of the backlink. Note response attempts and dates in your log.
  • Prepare a disavow file: After removal attempts (or immediately if the volume is huge), create a disavow list in the format Google requires — one entry per line, either domain:example.com to disavow an entire domain or the exact URL for a single link. Keep a copy offline and upload via GSC only when you’re ready.
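The merge/dedupe and disavow-generation steps above can be sketched in a few lines of Python. Assumptions: the exports have already been reduced to plain lists of source URLs, and you have manually decided which domains are disavow-worthy — don’t automate that judgment:

```python
from urllib.parse import urlparse

def merge_backlinks(*exports):
    """Combine backlink URL lists from GSC and third-party tools,
    deduplicated and sorted for auditing."""
    return sorted({url for export in exports for url in export})

def disavow_lines(urls, bad_domains):
    """Emit Google-format disavow entries (domain:example.com) for the
    flagged domains; return the remaining URLs for manual review."""
    lines = [f"domain:{d}" for d in sorted(bad_domains)]
    leftover = [u for u in urls if urlparse(u).netloc not in bad_domains]
    return lines, leftover

gsc = ["http://spam-farm.example/p1", "https://partner.example/post"]
ahrefs = ["http://spam-farm.example/p1", "http://casino-links.example/x"]
urls = merge_backlinks(gsc, ahrefs)
lines, review = disavow_lines(urls, {"spam-farm.example", "casino-links.example"})
print("\n".join(lines))  # ready to save as disavow.txt
```

Write the lines to a UTF-8 .txt file before uploading; keep the leftover list in your audit spreadsheet.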

Step 2 — Clean hacked content and security hardening

If the attack included hacked pages, injected links, or content cloaking, you must clean all infections before reclaiming rankings.

  • Restore from a clean backup: If available, restore to a pre-hack backup. If not, manually remove injected files, suspicious cron jobs, and unknown admin users.
  • Update everything: Core CMS (WordPress, Magento, etc.), plugins, themes, and server software. Vulnerable software is the usual entry point.
  • Change credentials: Reset passwords for FTP, SSH, database, CMS admins, and any API tokens. Use strong, unique passwords or password managers like 1Password or Bitwarden.
  • Harden server settings: Disable directory listing, restrict file permissions, and implement WAF rules (Cloudflare, Sucuri, or your host’s firewall).
  • Install monitoring and alerting: Use uptime monitors (UptimeRobot), integrity checkers, and security plugins that notify you of file changes.
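For the integrity-checking bullet, the core idea is a hash baseline plus a diff. Security plugins like Wordfence do this for you; this minimal sketch just shows what “notify me of file changes” means under the hood:

```python
import hashlib
import os

def snapshot(root):
    """Map every file path under `root` to its SHA-256 digest."""
    digests = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digests[path] = hashlib.sha256(f.read()).hexdigest()
    return digests

def diff_snapshots(before, after):
    """Report files added, removed, or modified since the baseline —
    any unexplained entry here is worth investigating."""
    return {
        "added": sorted(set(after) - set(before)),
        "removed": sorted(set(before) - set(after)),
        "changed": sorted(p for p in before.keys() & after.keys()
                          if before[p] != after[p]),
    }
```

Take a snapshot right after a clean restore, store it off-server, and diff against it on a schedule.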

Step 3 — Clean up on-site SEO and indexation

Once your site is secure, make sure only the correct pages are indexed and no spammy pages are visible to search engines.

  • Audit index coverage: In GSC, check which URLs are indexed. Remove spammy or injected pages via 404/410 or noindex (careful: robots.txt blocks crawling but does not remove already-indexed URLs — use noindex or GSC’s Removals tool for that).
  • Use canonical and noindex wisely: Ensure canonical tags point to the right versions. Use noindex for thin or duplicate pages introduced during the attack.
  • Check sitemap: Regenerate and re-submit your sitemap after cleaning. Make sure it only includes canonical, indexable pages.
  • Repair internal linking: Remove internal links to spam pages. Internal links pass link equity to those pages and keep them discoverable by crawlers, which slows the cleanup.
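As a sketch of the sitemap check: parse the sitemap, then drop anything that isn’t a clean, indexable 200. The page_info lookup here stands in for actually fetching each URL and reading its status code and meta robots tag — that fetch step is assumed, not shown:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract the <loc> URLs from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def clean_sitemap(urls, page_info):
    """Keep only URLs that return 200 and are indexable.
    page_info maps url -> (status_code, noindex_flag)."""
    keep, drop = [], []
    for u in urls:
        status, noindex = page_info.get(u, (None, None))
        (keep if status == 200 and not noindex else drop).append(u)
    return keep, drop
```

Anything that lands in the drop list either gets fixed or removed from the sitemap before you resubmit it in GSC.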

Step 4 — Reclaim lost rankings and content recovery

Reclaiming rankings is a mix of technical fixes and content work. Expect this to be gradual — rarely is it instantaneous.

  • Request reconsideration if penalized: If you received a manual action, file a reconsideration request with details of the clean-up: what you removed, your disavow file, and remediation steps.
  • Refresh and republish priority content: Pick your most valuable pages and update them with new data, better structure, and improved user experience. Fresh, high-quality content helps regain trust.
  • Rebuild authoritative links: Launch a PR and outreach campaign to earn high-quality backlinks: guest posts, partnerships, resource pages, or HARO. One-to-one outreach asking sites that previously linked to you to restore those links can also work.
  • Monitor rankings and clicks: Use GSC, GA4, and rank trackers to measure recovery. Track keyword clusters and pages rather than obsessing over a few keywords.
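Tracking clusters rather than individual keywords can be as simple as summing clicks per cluster from a GSC performance export, then diffing two periods. The substring matching here is a deliberate simplification of how you might group queries:

```python
def cluster_clicks(rows, clusters):
    """Sum clicks per keyword cluster.
    rows: (query, clicks) pairs from a GSC performance export.
    clusters: cluster name -> list of substrings that define it."""
    totals = {name: 0 for name in clusters}
    for query, clicks in rows:
        for name, terms in clusters.items():
            if any(t in query for t in terms):
                totals[name] += clicks
                break  # count each query toward one cluster only
    return totals

def recovery_delta(before, after):
    """Click change per cluster between two periods — positive
    numbers are the recovery you are waiting to see."""
    return {k: after.get(k, 0) - before.get(k, 0) for k in before}
```

Comparing month-over-month cluster deltas smooths out the single-keyword noise that makes recoveries hard to read.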

Step 5 — Prevent future attacks

Once you’re stable, implement long-term protections and monitoring. Prevention is always cheaper than recovery.

  • Implement rate limits and bot management: Services like Cloudflare or Google Cloud Armor can mitigate mass link scraping or abusive crawling patterns.
  • Regular backlink audits: Schedule quarterly backlink reviews. Early detection makes disavow/dispute efforts more effective.
  • Version-control your site: Use Git or another SCM, and deploy via CI/CD so unintended file changes are obvious and reversible.
  • Educate your team: Train admins and content editors on security best practices, phishing awareness, and when to escalate anomalies.
  • Keep an incident log: Document every incident—what happened, how you detected it, what you did, and the timeline. This becomes your playbook for future events and is useful for transparency with stakeholders or Google.
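An incident log works best when entries are structured from day one, so it stays greppable and easy to share with stakeholders. A minimal JSON Lines sketch — the field names are my own convention, not a standard:

```python
import datetime
import json

def log_incident(path, what, detected_by, actions, status="open"):
    """Append one structured incident record to a JSON Lines file,
    giving each event a timestamped, machine-readable audit trail."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "what": what,
        "detected_by": detected_by,
        "actions": actions,
        "status": status,
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

One line per event means the log doubles as input for any later analysis or reporting.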

Useful tools and templates I use

Here are the tools I rely on and a few templates you can adapt:

  • Backlink and audit: Ahrefs, Semrush, Majestic, Google Search Console
  • Security: Sucuri, Wordfence (for WordPress), Cloudflare, SiteLock
  • Monitoring: UptimeRobot, Datadog, Google Alerts
  • Disavow process: Keep a spreadsheet with columns: domain, source URL, anchor text, attempted removal date, response, status, disavow reason.
  • Outreach template (short): “Hello, I manage content for inbound-seo.uk. A link from your page [URL] to one of our pages appears to be low-quality/spammy and was not created by us. Could you please remove it? Thanks—Élodie.”

Negative SEO is stressful, but it’s manageable with a calm, methodical approach. Prioritise safety, document everything, and move from immediate damage control to long-term resilience. I’ve seen sites fully recover when owners follow these steps, and you can too.