Technical SEO Checklist: 15 Critical Issues Most Sites Miss | AuditMySite
Technical SEO Is the Foundation Everything Else Sits On
You can write the best content in your industry and build thousands of backlinks, but if Google cannot efficiently crawl, index, and understand your site, none of it matters. In our experience auditing 500+ websites over the past three years, we find that 82% of sites have at least 5 of these 15 critical technical SEO issues. Most have 8 or more.
This checklist is ordered by impact — fix #1 before worrying about #15.
1. Crawl Budget Waste from Non-Indexable URLs
Google allocates a crawl budget to every site. When Googlebot spends that budget crawling URLs that should not be indexed — parameter URLs, faceted navigation pages, session ID URLs — your important pages get crawled less frequently.
How to check: Review the Page indexing report (formerly Coverage) in Google Search Console alongside your server logs. If Googlebot is crawling thousands of URLs that return noindex or redirect, you are wasting crawl budget.
Fix: Block these URL patterns in robots.txt or serve an X-Robots-Tag: noindex response header. Use one or the other, not both: a URL blocked in robots.txt cannot be crawled, so Google never sees its noindex. For faceted navigation, implement rel=canonical pointing to the parent category page.
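A minimal robots.txt sketch for the parameter and facet patterns described above. The paths and parameter names here are placeholders, not recommendations; adapt them to your own URL structure before deploying:

```
User-agent: *
Disallow: /*?sessionid=
Disallow: /*?sort=
Disallow: /category/*/filter/
```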
Impact: Sites that cleaned up crawl budget waste saw important pages indexed 40-60% faster on average.
2. Missing or Incorrect Canonical Tags
Canonical tags tell Google which version of a page is the master copy. When they are missing, Google guesses — and it often guesses wrong, splitting your ranking signals across duplicate URLs.
Common mistakes:
- Self-referencing canonicals missing on key pages
- Canonicals pointing to HTTP URLs instead of their HTTPS equivalents
- Paginated pages all canonicalizing to page 1 (incorrect for most implementations)
- Canonical pointing to a 404 or redirected URL
How to check: Crawl your site with Screaming Frog or Sitebulb. Filter for pages where the canonical URL differs from the page URL and verify each is intentional.
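Screaming Frog and Sitebulb do this at scale; for illustration, here is a sketch of the per-page logic using only Python's standard library. The URL and HTML are hypothetical placeholders:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def check_canonical(page_url, html):
    """Classify a page's canonical setup, flagging the common problems."""
    finder = CanonicalFinder()
    finder.feed(html)
    if not finder.canonicals:
        return ("missing", None)
    if len(finder.canonicals) > 1:
        return ("multiple", finder.canonicals)  # conflicting signals
    canonical = finder.canonicals[0]
    if canonical == page_url:
        return ("self-referencing", canonical)
    return ("points-elsewhere", canonical)  # verify this is intentional

html = '<html><head><link rel="canonical" href="https://example.com/page"></head></html>'
result = check_canonical("https://example.com/page", html)
print(result)
```

Anything classified as "points-elsewhere" is the bucket to review by hand, since it is indistinguishable from a deliberate canonicalization without human judgment.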
3. Orphan Pages with No Internal Links
An orphan page has no internal links pointing to it. Google discovers pages primarily through links, so orphan pages may never be crawled — or may be crawled so infrequently that they never rank.
How to find them: Compare your XML sitemap URLs against your crawl data. Any URL in the sitemap that was not discovered during crawling is an orphan page.
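The comparison is a simple set difference. A sketch with hypothetical URL sets (in practice you would parse the sitemap XML and export the crawl data from your crawler):

```python
# URLs declared in the XML sitemap (hypothetical examples).
sitemap_urls = {
    "https://example.com/",
    "https://example.com/services",
    "https://example.com/blog/old-post",
}
# URLs a crawler reached by following internal links from the homepage.
crawled_urls = {
    "https://example.com/",
    "https://example.com/services",
}

# In the sitemap but never reached via links: these are orphan pages.
orphans = sitemap_urls - crawled_urls
print(sorted(orphans))
```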
Impact: We have seen orphan pages go from zero traffic to 500+ monthly visits simply by adding 3-5 internal links from relevant, authoritative pages on the same site.
4. Slow Server Response Times (TTFB over 1 second)
If your server takes more than 1 second to respond, every other performance optimization is fighting an uphill battle. Target TTFB under 200ms for optimal performance.
Common causes: Unoptimized database queries, no server-side caching, shared hosting with noisy neighbors, no CDN for geographically distributed audiences.
Quick wins: Implement page caching (WP Super Cache, Varnish, or Cloudflare APO for WordPress), move to a better hosting provider, and use a CDN for static assets.
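To see where you stand, you can measure TTFB directly. This sketch times a GET request against a throwaway local server for demonstration; point the host and port at your real origin to measure production (expect network latency on top of server time):

```python
import http.client
import http.server
import threading
import time

def measure_ttfb(host, port, path="/"):
    """Seconds from sending a GET request to receiving the first response byte."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.monotonic()
    conn.request("GET", path)
    resp = conn.getresponse()  # response headers arrive here
    resp.read(1)               # first byte of the body
    elapsed = time.monotonic() - start
    conn.close()
    return elapsed

# Demo only: a throwaway local server standing in for your real origin.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
server.shutdown()
print(f"TTFB: {ttfb * 1000:.1f} ms")
```

Run it several times and from several locations; a single sample can be misleading because of caching and connection reuse.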
5. Missing or Poorly Implemented Hreflang Tags
If you serve content in multiple languages or to multiple regions, hreflang tells Google which version to show to which audience. We find errors in 73% of hreflang implementations we audit.
Most common errors:
- Missing return tags (if page A points to page B, page B must point back to page A)
- Using incorrect language/region codes (en-UK instead of en-GB)
- Hreflang pointing to non-canonical URLs
- Missing x-default hreflang
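The return-tag rule is mechanical enough to check programmatically. A sketch, assuming you have already scraped each page's hreflang annotations into a dict (the URLs and language codes are hypothetical):

```python
def missing_return_tags(hreflang_map):
    """hreflang_map: {page_url: {lang_code: target_url, ...}, ...}
    Returns (source, target) pairs where the target page fails to
    link back to the source, violating the reciprocity rule."""
    problems = []
    for page, alternates in hreflang_map.items():
        for lang, target in alternates.items():
            back = hreflang_map.get(target, {})
            if page not in back.values():
                problems.append((page, target))
    return problems

pages = {
    "https://example.com/en/": {"en-GB": "https://example.com/en/",
                                "de-DE": "https://example.com/de/"},
    # The German page is missing its return tag to the English page:
    "https://example.com/de/": {"de-DE": "https://example.com/de/"},
}
issues = missing_return_tags(pages)
print(issues)
```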
6. Broken Internal Links (404 Chains)
Every 404 error encountered during a crawl wastes crawl budget and passes zero link equity. Worse, chains of redirects that ultimately lead to 404s create what we call dead-end crawl traps.
How to check: Run a full site crawl and filter for 4xx status codes. Pay special attention to links in global navigation and footer — a single broken link there multiplies across every page.
Impact: Fixing broken internal links on a 10,000-page e-commerce site increased crawl rate by 28% and indexed pages by 12% within 30 days.
7. Unoptimized XML Sitemaps
Your XML sitemap should be a curated list of your most important indexable pages — not a dump of every URL on your site.
Best practices:
- Only include pages that return 200 status and are indexable (no noindex, no canonicalized-away pages)
- Include lastmod dates that reflect actual content changes
- Keep sitemaps under 50,000 URLs and 50MB (split into multiple sitemaps if needed)
- Submit via Google Search Console and reference in robots.txt
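A minimal sketch of generating a sitemap that honors the first rule above (only 200-status, indexable pages), using Python's standard library. The page data is hypothetical; in practice it would come from your CMS or crawl export:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """pages: dicts with url, status, indexable, lastmod.
    Emits only 200-status, indexable URLs with their lastmod dates."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        if page["status"] != 200 or not page["indexable"]:
            continue  # never list redirects, errors, or noindexed pages
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["url"]
        ET.SubElement(url, "lastmod").text = page["lastmod"]
    return ET.tostring(urlset, encoding="unicode")

pages = [
    {"url": "https://example.com/", "status": 200, "indexable": True, "lastmod": "2026-01-15"},
    {"url": "https://example.com/old", "status": 404, "indexable": False, "lastmod": "2020-01-01"},
]
sitemap = build_sitemap(pages)
print(sitemap)
```

The same filter is where you would enforce the 50,000-URL limit, splitting into multiple files referenced by a sitemap index when the list grows past it.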
8. Missing Structured Data (Schema Markup)
Schema markup does not directly improve rankings, but it enables rich results that dramatically improve click-through rates. Pages with rich results see 35-50% higher CTR than standard blue links.
Priority schemas by site type:
- E-commerce: Product, Offer, AggregateRating, BreadcrumbList
- Local business: LocalBusiness, OpeningHoursSpecification, GeoCoordinates
- Blog/content: Article, FAQPage, HowTo, BreadcrumbList
- Restaurant: Restaurant, Menu, Review
Use Google Rich Results Test to validate your implementation and Schema.org documentation as reference.
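For reference, a Product schema sketch in JSON-LD, the format Google recommends. Every value here is a made-up placeholder; populate it from your actual product data and validate with the Rich Results Test:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://example.com/images/widget.jpg",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```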
9. JavaScript Rendering Issues
Google can render JavaScript, but it does so in a two-wave indexing process. Critical content loaded via JavaScript may not be indexed for days or weeks after the initial crawl.
How to check: Use Google Search Console URL Inspection tool and compare the rendered HTML with your source HTML. If important content is missing from the initial HTML, you have a JS rendering issue.
Solutions: Server-side rendering (SSR), static site generation (SSG), or dynamic rendering for Googlebot. Next.js, Nuxt, and SvelteKit all handle this well out of the box.
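The URL Inspection comparison boils down to one question: are your important phrases in the raw server response, or do they only appear after JavaScript runs? A trivial sketch of that check, with a hypothetical client-side-rendered page:

```python
def content_in_initial_html(source_html, key_phrases):
    """Report which important phrases appear in the raw server response.
    Phrases missing here are only visible after JavaScript executes,
    so their indexing depends on Google's rendering queue."""
    return {phrase: (phrase in source_html) for phrase in key_phrases}

# Hypothetical: raw HTML of a client-side-rendered product page.
source = "<html><body><div id='root'></div></body></html>"
report = content_in_initial_html(source, ["Acme 3000 Widget", "Add to cart"])
print(report)
```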
10. HTTP to HTTPS Migration Issues
Even in 2026, we find sites with incomplete HTTPS migration. Mixed content warnings, HTTP resources loaded on HTTPS pages, and missing redirects from HTTP to HTTPS all cause problems.
Quick audit: Use Why No Padlock or Chrome DevTools Security tab to identify mixed content. Ensure every HTTP URL 301-redirects to its HTTPS equivalent.
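A rough sketch of scanning a page's HTML for insecure resource references. A real audit tool handles more attribute forms (srcset, inline CSS url(), protocol-relative URLs); this regex only catches plain src/href attributes:

```python
import re

def find_mixed_content(html):
    """Return http:// URLs referenced by src/href attributes; these
    trigger mixed-content warnings on a page served over HTTPS."""
    return re.findall(r'''(?:src|href)=["'](http://[^"']+)["']''', html)

page = """<img src="http://example.com/logo.png">
<link rel="stylesheet" href="https://example.com/style.css">"""
insecure = find_mixed_content(page)
print(insecure)
```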
11. Thin Content Pages Diluting Site Quality
Pages with fewer than 200 words of unique content can be flagged as thin content. When a significant portion of your site is thin, it drags down the perceived quality of the entire domain.
Fix: Either expand thin pages with valuable content, consolidate them into more comprehensive pages, or noindex them if they serve a UX purpose but not an SEO one.
12. Missing Image Alt Text and Optimization
Images without alt text are invisible to Google Image Search and accessibility tools. We typically find 40-60% of images on audited sites are missing alt text.
Fix: Add descriptive, keyword-relevant alt text to every meaningful image. Compress all images (target under 200KB for most), use modern formats (WebP, AVIF), and implement lazy loading for below-the-fold images.
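Finding images with no alt attribute is straightforward with a standard-library HTML parser. Note this flags only a missing attribute; an empty alt="" is legitimate for purely decorative images and is deliberately not flagged here:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collects the src of every <img> tag lacking an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if "alt" not in a:  # alt="" (decorative) is allowed to pass
                self.missing_alt.append(a.get("src", "(no src)"))

audit = AltAudit()
audit.feed('<img src="hero.jpg"><img src="team.jpg" alt="Our support team">')
print(audit.missing_alt)
```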
13. Redirect Chains and Loops
When URL A redirects to URL B which redirects to URL C, that is a redirect chain. Each hop adds latency and crawl overhead, and Google follows only a limited number of hops (its documentation says up to 10) before abandoning the URL, so destinations deep in a chain may never be reached. Redirect loops (A to B to A) completely block crawling.
Fix: Update all redirect chains to point directly to the final destination. Most CMS platforms and server configurations make this straightforward.
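If you can export your redirect rules as a source-to-target map, flattening chains is mechanical. A sketch with hypothetical paths:

```python
def flatten_redirects(redirects):
    """redirects: {source_url: target_url}. Returns a map where every
    source points straight at its final destination, plus a list of
    sources caught in redirect loops."""
    flattened, loops = {}, []
    for start in redirects:
        seen, url = {start}, redirects[start]
        while url in redirects:      # keep following the chain
            if url in seen:          # revisited a URL: this is a loop
                loops.append(start)
                break
            seen.add(url)
            url = redirects[url]
        else:
            flattened[start] = url   # chain ended at a real destination
    return flattened, loops

chain = {"/old": "/interim", "/interim": "/final-page"}
flat, loops = flatten_redirects(chain)
print(flat, loops)
```

The flattened map is what you load back into your CMS or server config so every source redirects in a single hop.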
14. Incorrect Robots.txt Configuration
Robots.txt errors can block Google from your most important pages. We have audited sites that accidentally blocked their entire /products/ directory or their CSS/JS files (preventing rendering).
How to check: Use the robots.txt report in Google Search Console (which replaced the standalone robots.txt Tester). Verify that critical pages, CSS, and JavaScript files are all accessible.
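You can also test rules locally with Python's built-in robots.txt parser before deploying a change. The rules below are a hypothetical misconfiguration of the kind described above, blocking a CSS directory:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks CSS, a rendering-breaking mistake.
robots_txt = """\
User-agent: *
Disallow: /assets/css/
Disallow: /checkout/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for path in ("/products/widget", "/assets/css/main.css", "/checkout/"):
    print(path, "->", "allowed" if rp.can_fetch("Googlebot", path) else "BLOCKED")
```

Running every critical template URL, plus your CSS and JS asset paths, through a check like this catches accidental blocks before Googlebot finds them.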
15. Poor Mobile Usability
With Google mobile-first indexing, your mobile site IS your site. Common mobile issues include:
- Touch targets too close together (minimum 48x48px with 8px spacing)
- Content wider than screen causing horizontal scrolling
- Text too small to read (minimum 16px for body text)
- Interstitials blocking content on mobile
Putting It All Together
Technical SEO is not glamorous, but it is the foundation that makes everything else work. Before investing in content creation or link building, ensure your technical house is in order.
A strong technical foundation also supports your broader brand strategy. Your site is often the first impression of your brand — and if it is slow, broken, or poorly structured, no amount of beautiful branding can compensate. The team at BrandScout frequently works with businesses that have invested heavily in brand identity only to discover their website is undermining it with technical issues.
For Sacramento-area businesses especially, local search is hyper-competitive. Contractors and home service companies need every edge. SacValley understands the local search landscape and can help ensure your technical SEO is optimized for the Sacramento market specifically.
Start at the top of this list, work your way down, and reaudit quarterly. Technical SEO is not a one-time fix — it is an ongoing discipline.
Ready to audit your site?
Run a free SEO scan and get actionable recommendations in seconds.
Start Free Scan →