"Technical SEO" sounds like the part of search optimization where you need a developer on retainer. For most small businesses, it isn't. There are seven things Google actually cares about that a non-developer can check in a single afternoon. Some of them you can fix yourself with a free tool and 15 minutes. Some you'll need help with - but you'll know which ones.
We've run a lot of audits on small-business sites. The same seven items come up over and over. The same fixes work. This post walks through each one in order: what it is in plain English, the free tool that checks it, the most common way it goes wrong on a small-business site, and exactly what to do about it.
Block out three hours. You won't need all of them.
Why technical SEO is the 20% that does 80% of the work
On-page content matters. Backlinks matter. But neither will rank if the technical foundation is broken. Google has to be able to crawl your site, render it, trust that it's secure, understand what each page is, and serve it fast on a phone. If any of those break, your good content sits there invisible.
The seven checks below cover that whole foundation. Two of them are infrastructure (HTTPS, mobile). Two are crawling instructions (robots.txt, sitemap.xml). One is speed (Core Web Vitals). Two are clarity for Google (canonical tags, structured data). Run all seven once and you'll know whether your site has technical SEO problems or not. You'll also know which problems you can fix and which need a developer.
The 7 checks
1. robots.txt - the file that tells Google which pages to ignore
What it is: A plain text file at yoursite.com/robots.txt. It's a one-page instruction sheet for search engine crawlers - which folders or pages they're allowed to visit, which they should skip.
Free tool: Just type your domain plus /robots.txt into a browser. The whole file shows up. Google Search Console also has a robots.txt report under Settings.
Most common SMB failure: A robots.txt file left over from a staging site that says Disallow: / - which tells Google not to crawl anything. We see this on roughly 1 in 30 audits. The site has been live for months and nobody noticed traffic was zero. The other common failure is the opposite: no robots.txt at all, plus admin folders and search-result pages getting indexed and competing with real content.
The fix: If you see Disallow: / and your site is live, that line needs to go. If you have no robots.txt, the simplest version that works for most small businesses is:
```
User-agent: *
Allow: /
Sitemap: https://yoursite.com/sitemap.xml
```
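If you also want crawlers to skip admin screens and internal search results - the second failure mode above - a common extended pattern looks like this (the paths are WordPress-style examples; swap them for whatever your own platform uses):

```
# Example paths - adjust for your platform
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=
Sitemap: https://yoursite.com/sitemap.xml
```

One caveat: robots.txt controls crawling, not indexing. A blocked page that other sites link to can still show up in results, so a page that must stay out entirely needs a noindex tag instead.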
Most modern site builders (Squarespace, Webflow, Shopify, WordPress with Yoast or Rank Math) generate a sane robots.txt automatically. Check it once. Fix it if it's wrong. Move on.
2. sitemap.xml - the menu you hand to Google
What it is: An XML file that lists every page on your site you want Google to know about. It's the menu you give the waiter so they don't have to guess what's on offer.
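For reference, a minimal sitemap.xml looks like this - the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/services</loc>
    <lastmod>2025-01-10</lastmod>
  </url>
</urlset>
```

You'll almost never write this by hand - your CMS generates and updates it - but knowing the shape makes the Search Console report easier to read.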
Free tool: Google Search Console - the Sitemaps report under Indexing. Submit your sitemap URL there and Google tells you which URLs it discovered, which it indexed, and which it had problems with.
Most common SMB failure: The sitemap exists but it's never been submitted to Search Console, so Google is finding pages slowly through links instead of getting the full list at once. Second most common: the sitemap is there but it includes pages that 404 or redirect, which makes Google distrust the whole file.
The fix: Find your sitemap (it's almost always at yoursite.com/sitemap.xml). Open Search Console. Go to Indexing > Sitemaps. Paste the URL. Click Submit. Wait 48 hours, then come back and check the report - it'll tell you if any URLs failed and why. Fix or remove those URLs. Resubmit.
3. Core Web Vitals - how fast your site feels on a phone
What it is: Three speed measurements Google uses to judge whether your page provides a good user experience.
- LCP (Largest Contentful Paint): how long until the biggest thing on the page (usually the hero image or main headline) finishes loading. Target: under 2.5 seconds.
- INP (Interaction to Next Paint): how long the page takes to respond when someone taps a button or types in a field. This replaced FID in March 2024 and is the trickiest of the three because it measures real user interactions, not just page load. Target: under 200 milliseconds.
- CLS (Cumulative Layout Shift): how much stuff jumps around as the page loads. The classic offender is an ad or image loading late and pushing the text you were about to read down the screen. Target: under 0.1.
Free tool: PageSpeed Insights. Paste your URL, get a report for mobile and desktop with the three scores plus a list of specific things slowing the page down.
Most common SMB failure: Hero images that are 4MB JPEGs uploaded straight from an iPhone. They blow LCP out of the water and there's no other technical reason the site is slow. Second most common: too many third-party scripts (chat widgets, popup tools, analytics, retargeting pixels) that each add 100-300ms to INP.
The fix: Compress hero and product images to under 200KB each. TinyPNG does it for free, no account. Use WebP format if your site builder supports it. Then audit your third-party scripts - if you have a chat widget you don't actively use, an exit-intent popup that fires once a year, and three different analytics tools, kill the ones you don't need. Run PageSpeed Insights again. You'll usually see 10-20 points of improvement just from those two passes.
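If you can edit your page's HTML, the markup side of this is small. A sketch, with placeholder filenames: serve WebP with a JPEG fallback, give every image explicit dimensions so the browser reserves space (which protects your CLS score), and lazy-load images below the fold - but never the hero, since that's usually your LCP element:

```html
<!-- Hero image: WebP with JPEG fallback, explicit dimensions, never lazy-loaded -->
<picture>
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" alt="Storefront at dusk" width="1200" height="600">
</picture>

<!-- Below-the-fold images can load lazily -->
<img src="gallery-1.jpg" alt="Finished project" width="600" height="400" loading="lazy">
```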
4. Mobile-friendliness - because Google indexes your mobile site, not your desktop one
What it is: Whether your site works well on a phone. Since 2019, Google indexes the mobile version of your site by default. The desktop version is mostly ignored for ranking. If your mobile site is broken or hard to use, you're invisible.
Free tool: Google retired Search Console's Mobile Usability report and the standalone Mobile-Friendly Test in late 2023. The practical replacement is Lighthouse, built into Chrome: right-click your page, choose Inspect, open the Lighthouse tab, and run a mobile audit. It flags the same kinds of issues - text too small to read, content wider than the screen, tap targets too cramped.
Most common SMB failure: A "mobile-responsive" theme that technically works but has tap targets jammed too close together (links and buttons that are easy to misclick on a thumb), or font sizes set in fixed pixels that come out as 11px on a phone. Also common: pop-ups that cover the entire mobile screen with no obvious close button, which Google treats as an interstitial penalty.
The fix: Open your homepage on your actual phone. Try to tap every link and button without making a mistake. Try to read every paragraph without zooming. If you can't, those are real problems, and a Lighthouse mobile audit will usually name them. Most modern themes have a mobile-specific settings panel where you can bump up tap target sizes and base font size to 16px or higher.
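If you'd rather check the theme directly, these are the two settings that matter most. A sketch - the values are the common guidelines, and the selector is a placeholder for your own markup:

```html
<!-- In <head>: render at device width instead of a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* 16px is the usual minimum for comfortably readable body text on phones */
  body { font-size: 16px; }
  /* Padding grows links into roughly 48px tap targets, the common guideline */
  nav a { display: inline-block; padding: 12px 16px; }
</style>
```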
5. HTTPS - the padlock in the browser bar
What it is: HTTPS encrypts the connection between your visitor's browser and your site. The "S" stands for "Secure." It shows up as a padlock icon in the address bar. Sites without it get marked "Not Secure" in Chrome and most other browsers.
Free tool: Just visit your site. Look at the address bar. Padlock means HTTPS. "Not Secure" warning means no HTTPS. For a deeper check, SSL Labs gives your certificate a letter grade and flags configuration problems.
Most common SMB failure: A site with a valid HTTPS certificate that still serves images, CSS, or JavaScript over HTTP - "mixed content." Browsers show a broken-padlock warning for these pages. The other failure mode: an expired certificate. We've seen businesses lose two weeks of traffic because nobody noticed the cert expired and the site was throwing security warnings to every visitor.
The fix: If you don't have HTTPS at all, every modern hosting platform (Squarespace, Shopify, Webflow, WordPress on managed hosting) provides free certificates through Let's Encrypt with one click. Turn it on. If you have HTTPS but mixed content, you need to find any hardcoded http:// URLs in your content or theme and update them to https://. Most CMSes have a search-and-replace tool for this.
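If your site runs on Apache shared hosting with an .htaccess file - a common setup outside the big site builders - a standard force-HTTPS redirect looks like this sketch (managed platforms do the same thing with a toggle):

```apache
RewriteEngine On
# If the request arrived over plain HTTP...
RewriteCond %{HTTPS} off
# ...301-redirect it to the HTTPS version of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The 301 matters: it tells Google the HTTPS version is the permanent address, so existing link equity follows it.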
The local-search angle: Google confirmed HTTPS as a ranking signal for search back in 2014, and site security feeds into the overall quality picture local rankings draw on. If you and your closest competitor are matched on everything else and they have HTTPS and you don't, they win. For a local business that lives or dies by Map Pack visibility, HTTPS isn't optional.
6. Canonical tags - which version of a page is the real one
What it is: A small line of HTML in the <head> of each page that says "this is the official URL for this content." It's how you tell Google which version to rank when the same or similar content exists at multiple URLs.
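In practice it's one line. A placeholder example:

```html
<link rel="canonical" href="https://yoursite.com/products/blue-widget">
```

The key detail: that same tag, pointing at the same clean URL, should appear on every variant of the page - including filtered versions like /products/blue-widget?color=navy - so Google knows they're all one product.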
Free tool: Screaming Frog SEO Spider (free for up to 500 URLs, plenty for most small-business sites). It crawls your site and reports the canonical tag on every page, plus flags pages where the canonical points somewhere unexpected.
Most common SMB failure: E-commerce sites where every filter combination (?color=blue&size=medium) creates a new URL with no canonical pointing back to the main product page. Google sees ten URLs that all show essentially the same product, can't decide which is the original, and ranks none of them well. The other common failure: pages where the canonical accidentally points to the homepage instead of itself, which tells Google "don't bother indexing me, just rank the homepage."
The fix: Most CMSes set canonical tags automatically and correctly. The fix is mostly about confirming nothing is broken. Run Screaming Frog (or use Search Console's Page Indexing report, which flags "Duplicate, Google chose different canonical than user" issues). If you have lots of filtered URLs, work with your developer or platform support to point those canonicals back to the main product or category page.
7. Structured data - the labels that help Google understand what your page IS
What it is: Code in your page's HTML (usually JSON-LD format) that labels what the page is about in a way Google can read precisely. Not just "this page has the word doctor" but "this page is about Dr. Sarah Patel, a board-certified orthopedic surgeon, at this address, with these office hours, who accepts these insurance plans." It's also what makes your business eligible for rich results - the star ratings, FAQ dropdowns, recipe cards, and event listings you see in Google search results.
Free tool: Google's Rich Results Test. Paste your URL, get a list of every structured-data block on the page, see whether each one is valid, and see which rich-result types it qualifies for. Schema.org's Schema Markup Validator is the more technical companion tool.
Most common SMB failure: No structured data at all. Roughly half the small-business sites we audit have nothing - no LocalBusiness schema, no Product schema, no FAQ schema, nothing. The other common failure: incomplete LocalBusiness blocks that have the business name and address but no hours, no phone, no sameAs links to social profiles, no reviews. Google can technically read it but doesn't trust it enough to use for rich results.
The fix: At minimum, every small business should have a LocalBusiness or Organization schema block on the homepage with name, address, phone, hours, website, and sameAs links to your Facebook, Instagram, LinkedIn, and Google Business Profile. If you sell things, every product page needs Product schema with price and availability. If you have an FAQ section anywhere, wrap it in FAQPage schema. The Rich Results Test will tell you exactly what's missing.
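A minimal LocalBusiness block looks like this - every value is a placeholder to swap for your own details - and it can sit anywhere in the page's HTML:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "url": "https://yoursite.com",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "openingHours": "Mo-Fr 09:00-17:00",
  "sameAs": [
    "https://www.facebook.com/exampleplumbing",
    "https://www.instagram.com/exampleplumbing"
  ]
}
</script>
```

Run the page through the Rich Results Test after adding it; the validator will catch any field it can't parse.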
How long this actually takes
If you're using a modern CMS and your site isn't broken, the seven checks take 30-45 minutes total. Expect to find two or three things wrong - most likely a sitemap not submitted to Search Console, a hero image that's too heavy, and missing or thin structured data. Each of those takes 20-40 minutes to fix yourself.
The two checks where small-business owners often need a developer: canonical tag problems on e-commerce filter pages, and full structured-data implementation across product templates. Everything else - the other 80% - is checkable and fixable from your laptop in an afternoon.
ClearGrade runs all 7 checks automatically
ClearGrade's audit at https://cleargradeai.com runs all 7 of these checks automatically and grades each one - plus we'll flag the ones a non-developer can fix vs the ones that need a developer. You get the list, the grades, and what to fix first. No paying an agency $2,500 a month to discover your robots.txt is wrong.
The 3-hour checklist
Print this. Tape it to your monitor. Cross items off as you go.
- [ ] robots.txt - visit yoursite.com/robots.txt. Confirm it's not blocking your whole site. Confirm your sitemap is referenced.
- [ ] sitemap.xml - submit it to Google Search Console. Wait 48 hours. Fix any URLs that failed.
- [ ] Core Web Vitals - run PageSpeed Insights on mobile. Compress hero images. Kill third-party scripts you don't need.
- [ ] Mobile-friendliness - open your site on your actual phone. Tap every link. Read every paragraph. Fix what's broken.
- [ ] HTTPS - confirm the padlock shows on every page. Turn on Let's Encrypt if you don't have it. Fix any mixed-content warnings.
- [ ] Canonical tags - run Screaming Frog or check Search Console's Page Indexing report. Fix any "Duplicate, Google chose different canonical" issues.
- [ ] Structured data - run the Rich Results Test on your homepage and one product or service page. Add LocalBusiness schema if missing.
That's it. Seven checks, three hours if everything's broken, 30 minutes if most of it works. The whole technical SEO foundation for a small business fits on one page. Most agencies will not tell you that.