How SEO Setups Increase Traffic

Congratulations, you have a website. You’re one of more than 1.1 billion websites in the world. The real trick is standing out, so what’s your plan?

Truth is, most businesses rely on search engines, specifically Google. Whether you know it or not, the majority of your search traffic is, or should be, coming through organic search channels. It’s the best way to reach a larger audience and capture attention while people are still in the awareness and research stages.

Most companies we work with mirror these top traffic sources.

SEO Setups increase organic traffic image

Being proactive and setting a proper foundation for your website can reap major rewards down the road.

Below is a list of just a few of the major technical hurdles our SEO setup tackles for you. Refer to the pricing page to view the full list of services provided. Again, our setup consists of one-time updates that should last indefinitely, barring any major development changes.

Each of these items can have a drastic impact on your traffic.

  • Proper page indexation
  • Duplicate title & description tags
  • Semantic URL structures
  • XML sitemap submission
  • Robots.txt directives
    • Even a WordPress bug that automatically de-indexes your site
More pages indexed after SEO setup image


Proper Page Indexation

In many cases we find that websites are not being crawled and indexed by Google. If search engines can’t crawl your site and identify what your content is about, you’ll never show up in search.
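One common, easy-to-miss cause of a page never showing up is a noindex robots meta tag left in its HTML. The sketch below checks for one using Python’s standard-library HTML parser; the class name and sample page source are illustrative, not part of our setup:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

# Hypothetical page source for illustration only.
html = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'

finder = RobotsMetaFinder()
finder.feed(html)

# Any "noindex" directive keeps the page out of search results entirely.
blocked = any("noindex" in d.lower() for d in finder.directives)
print(blocked)  # True: this page would never appear in search
```

Running a quick check like this across your key pages can catch a stray noindex tag before it quietly erases your organic traffic.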

Equally bad, some sites have been manually de-indexed by Google for web spam. These penalties can be corrected once identified, but most businesses don’t know how to find this information. We’ll uncover any Google penalties for you and provide a recommended course of action.

After our SEO setup, just keep creating relevant content for your audience and the basics of on-page optimization will already be covered, putting you ahead of the game.

Check out our case studies to see how indexation alone boosted traffic by 650%!

Duplicate Title & Description Tags

Any SEO health check, scorecard, or site audit will flag duplicate title or meta description tags. Popular tools that check for this include Google’s Search Console (image below), Raven Tools, and HubSpot’s Marketing Grader tool.

Duplicate title content is an issue for many reasons, one being that the title tag is a major ranking signal. When multiple pages share the exact same title, Google doesn’t know which one should rank first. You’re essentially competing with yourself, and in many cases Google chooses to ignore both pages and rank another website that is clearer.

Fix duplicate content with an SEO setup image
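To illustrate the problem, here is a minimal Python sketch that groups a set of crawled pages by title; any title shared by two or more URLs is a page competing with itself. The URLs and titles are made up for the example:

```python
from collections import defaultdict

# Hypothetical crawl results: URL -> <title> text (illustrative only).
page_titles = {
    "/services": "Acme Co | Services",
    "/about": "Acme Co | About",
    "/services/seo": "Acme Co | Services",  # duplicates /services
}

# Group URLs by title; any group with 2+ URLs is competing with itself.
by_title = defaultdict(list)
for url, title in page_titles.items():
    by_title[title].append(url)

duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)  # {'Acme Co | Services': ['/services', '/services/seo']}
```

Tools like Search Console run essentially this grouping for you, but the logic is simple enough to script against your own crawl data.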

URL Structure

Like the title tag, the URL itself is a major ranking signal. It’s important to set up permalinks that keep URLs semantically friendly, meaning they use actual words rather than codes or numerical values.

An example would be a sales page for blue Adidas shoes. A URL built from numeric IDs or query parameters wouldn’t perform quite as well as a semantic URL structured as /category/brand/color/.
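As a rough illustration of how a semantic slug is generated from real words, here is a minimal Python sketch; the slugify helper and the path layout are hypothetical, not a description of any particular CMS:

```python
import re

def slugify(text: str) -> str:
    """Lowercase the text, drop punctuation, and join words with hyphens."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return "-".join(words)

# A semantic path built from real words instead of a numeric product ID.
path = f"/shoes/adidas/{slugify('Blue Running Shoes')}/"
print(path)  # /shoes/adidas/blue-running-shoes/
```

Most platforms, WordPress included, generate permalinks this way once semantic permalinks are enabled, so the words in the page title carry through to the ranking signal in the URL.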

XML Sitemaps

Adding an XML file that specifies all pages that should be crawled by search engines is a standard practice. This file provides Google with direct links to your pages so they can be crawled. This step helps ensure that all pages are crawled and indexed by all search engines.
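A minimal sitemap file looks like the sketch below; the domain and date are placeholders, not real entries:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
  </url>
</urlset>
```

Each loc entry is a page you want crawled, and the optional lastmod date helps search engines prioritize recently updated pages.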

You can view our XML sitemap here:


Robots.txt

The robots.txt file is similar to the XML sitemap, but gives search engines the opposite message: it instructs them which pages should not be crawled or indexed in organic search. This is very important for confirmation pages, content that requires an email opt-in, or private pages that should not be publicly visible.

An example of our robots.txt file is below. It shows how to block specific groups of pages. The last line tells the search bot where to find the XML sitemap file so it can then crawl the list of acceptable pages.

User-agent: *
Disallow: /wp-admin/
Disallow: /success
Disallow: /thank-you
Disallow: /tag/
Sitemap: https://example.com/sitemap.xml

Additionally, WordPress has a settings checkbox that automatically updates the robots.txt file to block all search engines from crawling your website. This means NONE of your pages will be available in search and your traffic will suffer!

See the image below of how to find this setting in your WordPress dashboard.

SEO Setup fixes de-indexing

When the box is checked, your robots.txt file will change to this:

User-agent: *
Disallow: /

Disallow: / means do not allow any search engine to crawl any page on the website.
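You can verify the effect of a blanket block (written Disallow: / in standard robots.txt syntax) with Python’s standard-library robots.txt parser; the user agent and paths below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Parse the blanket-block rules directly, as they would appear on disk.
parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow: /"])

# Every path is now off-limits to compliant crawlers.
print(parser.can_fetch("Googlebot", "/"))          # False
print(parser.can_fetch("Googlebot", "/any-page"))  # False
```

A quick check like this is an easy way to confirm, before launch, that a staging-era block didn’t ship to production.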


Whether for your personal site or your clients’ sites, setting a proper SEO foundation as soon as possible will set you up for success down the road. Once the setup is complete, additional SEO and copywriting will only help increase your traffic. Ongoing work of any kind does not interfere with or hinder the outcome of our initial SEO setup.

For more information or questions about this service, leave a brief description in the contact form.