Search Engine Optimization (SEO)
If you own or manage a website, you need to know at least a little SEO. This guide will help you learn the basics so you can help your audience find the amazing content you’ve been working on.
What Is SEO?
Search engine optimization, or SEO, is the process of helping search engines understand the content on your website. If you optimize your site correctly, you’ll see more traffic from the actual audience that you’re trying to reach.
Many SEO disciplines focus on different aspects of your site. Before you get started, make sure you understand the basic terminology related to this important marketing field.
White Hat or Black Hat?
Not all tactics used to improve search engine rankings are ethical. Black hat SEO is a term used for SEO tactics that attempt to game or break the system. This includes things like keyword stuffing, link spam, and anything else that might trick Google into giving your site an undeserved ranking.
White hat SEO refers to SEO tactics that play by the rules. Google is constantly trying to thwart black hat users, so it’s in your best interest to use white hat techniques.
Local SEO
Local SEO refers to tactics that improve your local search engine rankings. This means that if someone in your area searches for “restaurants near me,” they’ll find your coffee shop or diner.
Great local SEO uses the same principles as normal search engine optimization with a focus on local relevance. Make sure that your site has relevant contact information, claim your business listings, and target keywords that include your city or zip code.
SEM, PPC, and Google Ads
Search Engine Marketing, or SEM, refers to any paid marketing tactics that use search engine results. This includes Google Ads and other forms of pay-per-click advertising. SEO and SEM are closely intertwined, and SEM tactics work best when your site meets current SEO standards.
How SEO Works
The field of SEO is designed to help both users and search engines understand and navigate your site. SEO professionals learn how search engines index information and then apply that knowledge to the way your site is structured.
A crawler is a tool that search engines like Google use to understand the internet. Crawlers are internet bots that visit websites, explore available links, and index the information. This index is then used to answer the queries typed into the search engine.
Technical SEO is the practice of making your site as accessible to crawlers as possible. Because crawlers are designed to think like users, having good technical SEO will usually result in a website that’s easy for your visitors to navigate and understand.
On-page SEO refers to the optimization of elements on your actual site. This includes filling out metadata, creating good content, and changing your layout to create a better user experience. Because you always have control of your website, on-page SEO tactics are generally the easiest to implement and maintain.
Off-page SEO impacts the way your site is seen by the rest of the internet. Great off-page SEO involves building links and creating a digital presence for your business on social media. When on-page and off-page SEO work together, search engines see your site as more relevant and improve your overall ranking as a result.
SEO Step One: Crawl Accessibility
Google’s crawlers are surprisingly smart, but they still need help understanding your site. Follow webmaster guidelines and technical SEO best practices to make sure the crawlers can find their way around.
When a crawler visits your site, it typically starts at the home page. It then explores your site by following links from one page to the next. If a page isn’t linked from anywhere, it won’t be visited.
This crawling pattern mirrors the way that a user might explore your site. A large part of SEO is creating a path that both the crawler and your users can follow to reach every piece of content. Interlinking generally involves three standard practices:
- Using an intelligent page hierarchy to make navigation simple and transparent.
- Including a sitemap to help users find their way around.
- Linking to relevant pages at natural points in your site content.
All of your pages should be linked to somewhere in your site’s navigation. However, you can also help crawlers understand your pages by linking to them from other relevant places in your site’s content. An example of this might be linking from one blog post to another post on a similar topic.
If you interlink your pages, make sure that the anchor text is descriptive and useful to both the crawler and the user. If you link to your “Contact” page, the anchor text should say something like “contact us” or “get in touch.”
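For example, the difference between vague and descriptive anchor text looks like this in plain HTML (the /contact path is just an illustration):

```html
<!-- Vague anchor text: neither the crawler nor the user learns anything from "here" -->
<p>To reach us, click <a href="/contact">here</a>.</p>

<!-- Descriptive anchor text: both can tell where the link leads -->
<p>Have questions? <a href="/contact">Get in touch with our team</a>.</p>
```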
It’s also essential to limit the number of links on each page. Crawlers will only explore around 150 links on any given page, although this number can be increased for pages where it’s absolutely necessary. Since your users don’t want to click through a ton of links, it’s usually smart to save your link-heavy pages for things like blog archives or staff directories.
Building a Page Hierarchy
Your page hierarchy refers to the way pages are nested within your website’s directory structure. Your home page sits at the very top level. The rest of your pages are then organized in sublevels based on their relevance to each other.
When a crawler visits your site, it generally crawls and indexes the top-level pages first. Deeply nested pages may not be crawled until a later visit.
From an SEO standpoint, this means you should place the most important pages on the highest level of the hierarchy. Your nested pages should then be placed in the categories that make the most sense. As an example, a plumbing site might have a “Services” page on the top level, and an “Air Conditioning” page nested under “Services.”
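Sketched as a directory tree, the plumbing example above might look like this (the page names beyond “Services” and “Air Conditioning” are hypothetical):

```text
www.SITENAME.com/                  <- home page (top level)
├── services/                      <- top-level category page
│   ├── air-conditioning/          <- nested under Services
│   └── drain-cleaning/
├── about/
└── contact/
```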
Creating a Sitemap
A sitemap is a condensed version of your entire site’s navigation. Both crawlers and users will frequently reference your sitemap, and you can even submit your sitemap to Google as an invitation for indexing.
If you’ve been using a clear page hierarchy, generating a sitemap should be easy. Google can read sitemaps in XML, RSS, and TXT formats. Google also recommends that you post your sitemap under the root directory to make it as accessible as possible; this might look like www.SITENAME.com/sitemap.txt.
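A minimal XML sitemap, for instance, is just a list of URLs wrapped in a standard schema (the URLs and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.SITENAME.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.SITENAME.com/services/</loc>
  </url>
</urlset>
```

A TXT sitemap is even simpler: one full URL per line, nothing else.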
Adding a Robots.txt File
A robots.txt file is a text document that tells Googlebot and other web crawlers how to crawl your site. By stating which pages are important and which are safe to ignore, you can speed up the crawling process and greatly improve your site’s search engine compatibility.
If you want the bots to find your robots.txt, you need to place it in your site’s main directory. You can find this file on any site by typing in www.SITENAME.com/robots.txt.
According to Google, most sites don’t need a robots.txt. This file is particularly useful if your site generates new pages for search results, links to a lot of advertisements, or has a large amount of content that you don’t want to be indexed.
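As an illustration, a simple robots.txt might block a generated search-results directory and point crawlers at your sitemap (the paths here are hypothetical):

```text
User-agent: *
Disallow: /search/
Disallow: /ads/

Sitemap: https://www.SITENAME.com/sitemap.xml
```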
Your Crawl Budget
Google’s bots need time to crawl all of the pages on the internet. Your crawl budget represents the average number of pages that Googlebot will crawl on any given day. If you’ve noticed that your site isn’t getting indexed promptly, you might need to optimize your crawl budget.
Crawl budget problems typically occur when your site has too many 404 errors or unnecessary 301 redirects. If you clean up the broken links and redirect chains on your site, Googlebot won’t waste any more time on them. You can also protect your budget by disallowing unimportant sections in your robots.txt.
Google and Bing Webmaster Guidelines
The crawling tips listed here are part of Google’s webmaster guidelines. Following these guidelines is easy – just make sure that all of your content is immediately accessible to the average user, and allow crawling for any site elements that might be directly relevant to the user experience.
Bing’s webmaster guidelines are very similar to the ones laid out by Google. The main difference is that Bing has a larger focus on targeted keywords, while Google’s algorithms are better at understanding user intent. For basic SEO purposes, assume that any site with a clean directory and well-titled pages can be crawled by both Bing and Google’s search engine bots.
SEO Step Two: Completing Metadata
Metadata is information about your site that’s hidden within the HTML. Crawlers use metadata to understand what’s on the page without actually reading it. Some metadata is also visible to the user, so it’s important to write it with readers in mind.
If you’re using a CMS, completing your metadata should be easy; just fill out every prompted field when you create a new page or blog post. If you’re building your site in plain HTML, you’ll include the metadata in the head section of each page.
Meta Titles
A meta title is the text that appears at the top of your browser window or tab when you view a page. This is possibly the most important piece of metadata, and you should make sure to include a unique meta title for every page on your site.
Good meta titles should explain the content on the page using relevant keywords. Don’t include any unnecessary information, as this will just confuse the crawler. A good meta title for your “Contact” page might be “Contact Company Name.”
Meta Descriptions
Meta titles and meta descriptions both appear in search engine results. Your meta description is the snippet of information that appears beneath your meta title. Crawlers will use the meta description to find keywords, and users will read it before deciding whether to click on your page.
Meta descriptions should be between 50 and 160 characters; anything longer will get cut off in Google’s search results. Write something short but interesting, and make sure to include any relevant keywords. If you fail to specify a meta description, users will see a sample of the content pulled from that page instead.
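In plain HTML, the meta title and description live in the page’s head section. A hypothetical “Contact” page might use:

```html
<head>
  <!-- Meta title: shown in the browser tab and as the search result headline -->
  <title>Contact Company Name</title>

  <!-- Meta description: the snippet shown beneath the title in search results -->
  <meta name="description"
        content="Get in touch with Company Name for service quotes, questions, and support.">
</head>
```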
Meta Keywords
Meta keywords used to be the core of SEO. However, because they were so easy to abuse, these tags became a favorite tool of the black-hat SEO professional. Today, most search engines don’t treat meta keywords with the same level of relevance.
This means that you don’t need to include meta keywords in your site’s data. You also don’t need to remove them if they’ve been automatically generated by your content management system or one of your plugins.
Alt Text
Alternative text is used for images, videos, and other media files. Crawlers don’t have eyes, so they can’t understand images on their own. Instead, they refer to the alt text to see what the image is all about.
Alt text is important for universal accessibility. Some people browse with screen readers or with images disabled, and the alt text prevents them from missing information. Alt text is also helpful if the images on your site fail to load due to a bad internet connection.
Get into the habit of including alt text for every media file that you upload on your site. Your CMS should prompt you to do this automatically; just make sure not to ignore the field. Use clear and concise descriptions and include any relevant keywords as necessary.
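In HTML, alt text is just an attribute on the image tag (the file name and description here are made up for illustration):

```html
<!-- Descriptive alt text tells crawlers and screen readers what the image shows -->
<img src="/images/kitchen-sink-repair.jpg"
     alt="Plumber repairing a leaking kitchen sink trap">
```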
Robots Meta Tags
Much like the robots.txt file, a robots meta tag is used to tell crawlers how to treat a specific page or link. These tags are particularly useful if you want to keep a single page out of the index, something a robots.txt file alone can’t guarantee, but they aren’t necessary for beginning SEO.
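For instance, to keep a single page out of the index, you could place a robots meta tag in that page’s head section:

```html
<!-- Tells compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```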
SEO Step Three: Content Development
Most SEO tactics are designed to help search engines and users find your site. However, none of those methods will improve your rankings unless your site contains the content that users are looking for. That’s why an important part of modern SEO is filling your site with useful, interesting, and compelling content.
Conducting Keyword Research
Meta keywords might be out of style, but that doesn’t mean keywords are completely irrelevant. If a user searches for “cats,” a page that talks a lot about cats is going to rank higher in the search engine results than a page about dogs.
Keyword research is a reverse engineering process that uses popular searches to decide what your pages should be about. Using the above example, you might decide whether to write a post about cat training or dog training based on which keyword gets the most hits.
Keyword research is usually accomplished with an SEO tool like Google’s Keyword Planner. Type in a keyword and see how many people have searched for it in the last month or year. Then, click on related keywords to see what else people are interested in. Use this information to guide your content marketing strategy.
Understanding Search Intent
In the old days of SEO, spamming a keyword was enough to make your page rank for that topic. You could say the word “cats” 50 times in an article about dogs and have a pretty good chance of ranking.
Nowadays, Google cares more about the search intent of the user. If you want to read about cats, you’re only going to get an article about cats – even if the dog article had more instances of the keyword that you typed in.
As a website owner, understanding the intent of your users can help you make better content. Are they looking for information, or do they want to buy a product? Think about why someone would type in the keyword, and help the user get what they’re looking for.
What Makes Good Content?
Google rewards sites with high-quality content. Creating great content is the quest of every digital marketer. As you start writing, keep these three traits in mind:
- Relevant: Content needs to be about the keywords that it ranks for. Stay focused on the main topic, and place tangents in their own articles.
- Useful: Your content should answer the question that caused the user to search in the first place. The more information you provide, the better you will rank.
- Interesting: Content that gets shared ranks higher, and people share content that they enjoyed. You’re on the right track if you create content that you would want to read.
Basic Link Building
One of the fundamental concepts of off-page SEO is that a site that has been linked to by other sites will get a higher ranking. Link building is the practice of getting as many legitimate and reputable links to your site as possible.
Link building should be one of the last steps of your SEO process. Once your site is full of excellent content, create connections using a few of the following methods:
- Publish guest posts. If you publish a guest post on another site, the host will link back to your website in the author credit. You can also link to your own relevant blog posts throughout the article.
- Join directories. Some sites host directories of sites about a specific topic. Joining relevant and reputable directories will help users find you and tell Google that your site is more important.
- Link to other sites. If you want sites to link to you, try linking to them. Trade guest posts, link to your favorite articles, and watch your network naturally grow.
- Create shareable content. Social media shares function like links, although they aren’t quite as valuable as links from static pages. Content that gets shared widely tends to earn more backlinks and climb in search results.
SEO Tools
SEO is complicated, but you don’t need to handle it all on your own. The internet is full of both free and paid tools that will help you understand your website and determine whether your efforts are actually working.
Google Search Console
Google Search Console is a tool that lets you see your site from Google’s point of view. The console will tell you which pages have been indexed, help you find and correct problems, and track where your site traffic is coming from.
Google Analytics
Where Google Search Console is focused entirely on search results, Google Analytics is focused on what users actually do on your site. Find out which pages they’re visiting, how long they stick around, and which links they clicked to get there.
Ahrefs
Ahrefs is an all-in-one SEO tool that will help you audit your site and improve its content. Useful features like a keyword explorer and a rank tracker will assist you as you grow your site over time.
SEMRush
SEMRush is a tool focused on search engine marketing. Research keywords, conduct a link analysis, and find out how you can increase your traffic. SEMRush also lets you analyze your competitors’ online advertising strategies.
Google Keyword Planner
Google’s Keyword Planner is a free keyword research tool that works with Google Ads. You can view keyword statistics, get relevant keyword suggestions, and even see how much Google thinks you should bid on particular keywords to get the most traffic.
Ubersuggest
Ubersuggest is a keyword tool from well-known SEO professional Neil Patel. You can analyze either a keyword or a domain. The tool will show you traffic information, suggest content ideas, and even tell you how many backlinks point to the pages that rank best for that keyword.
Google Algorithm Updates
Whenever Google’s search engine algorithm is updated, it completely changes the SEO game. Understanding the history of Google’s updates can help you figure out what the search engine giant is looking for in a quality site.
- Panda – In 2011, the Panda update started assigning content quality scores to websites. This is why the internet now focuses on strong, relevant, and interesting content instead of spammy content stuffed with keywords.
- Penguin – The 2012 Penguin update sought to defeat manipulative link building tactics. Links now only improve your ranking if they’re from reputable and relevant sites.
- Pirate – The 2012 Pirate update made it harder for sites that provide copyrighted content to appear in search results.
- Hummingbird – In 2013, the Hummingbird update improved Google’s ability to understand search intent. The focus shifted from individual keywords to what users were actually looking for.
- Pigeon – The Pigeon update came out in 2014 and brought standard SEO criteria to local SEO.
- Mobile-Friendly Update – In 2015, the update nicknamed Mobilegeddon made it so that mobile-friendly sites are more likely to show up in mobile search results.
- RankBrain – This 2015 update improved Google’s machine learning capabilities and helped the search engine understand user intent.
- Possum – The 2016 Possum update allows Google to change results based on the location of the user. This impacts retailers the most, but it’s relevant to anyone worried about local SEO.
- Fred – The mysterious 2017 update known only as “Fred” doesn’t have a publicly announced purpose, but it seems to have reduced rankings for sites with low-quality or ad-focused content.
- Core Updates – In 2019, Google released several updates to its algorithm that seem to be related to site reputation. These updates have cemented the importance of trustworthy backlinks and relevant content.
No matter how often Google updates or what kind of site you want to run, the core principles of SEO remain the same. As long as you create a user-friendly site with simple navigation and interesting content, Google will make sure that the right people can find you. Keep updating your content, and your rank will continue to improve.