The First Step – Knowing what is going on behind the scenes
The all-knowing Google can be complex at times, but grasping the general idea of how pages are indexed and ranked isn’t hard. Google is constantly crawling the web to discover newly created pages and changes to old ones. The catch is that Google has to work out what exactly each page is about and whether it deserves to rank well for certain searches.
What does Google look at?
“Googlebot” first discovers your web page through links to it from other places on the web. It then processes the text on the page and deciphers its intent along with the important keywords it contains. From there, it analyzes image alt text, title tags, the links attached to the page, and many other SEO signals. But if your brand-new website has no links pointing to it, it may not have been found yet, so do a quick check by typing this into Google: “site:yourwebsitename.com”. The results will show whether Google has found your site or not yet.
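Beyond checking whether Google has found your site, it is worth making sure nothing on the page tells Googlebot to stay away. A minimal sketch (the `has_noindex` helper is our own illustration, not a Google tool) that scans a page’s HTML for a robots “noindex” meta tag, which would tell Googlebot not to index the page:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""

    def __init__(self):
        super().__init__()
        self.robots_content = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if (attr_map.get("name") or "").lower() == "robots":
                self.robots_content.append((attr_map.get("content") or "").lower())


def has_noindex(html: str) -> bool:
    """Return True if the page asks crawlers not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in content for content in parser.robots_content)
```

To check a live page, you could fetch the HTML with `urllib.request.urlopen` and pass the decoded body to `has_noindex`. Note this only covers the meta tag; a page can also be kept out of the index via robots.txt rules or an `X-Robots-Tag` response header.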
The Second Step – Create a robots.txt file
The robots.txt file is nothing more than a simple text file that lives in the root directory of your website’s domain. It isn’t strictly mandatory, but it is part of SEO best practice for new websites and helps Google crawl sites with a huge number of pages. On top of that, robots.txt can prevent massive headaches when it comes to duplicate content. Here’s how it works: robots.txt is the first file that web crawlers like Googlebot look at and take instruction from. It tells them which pages are okay to crawl and which should not be indexed at all. This is especially useful for websites that serve the same content on two or more pages, since duplicate content can very quickly hurt a website’s rankings. If your website host offers cPanel, check your file manager for the file and, if it doesn’t exist, create it. Make sure you use a plain-text editor like Notepad rather than Microsoft Word or another rich-text editor, which can add formatting that crawlers can’t read. We recommend being careful with edits to your robots.txt file, and double-checking your creation with Google’s robots.txt testing tool.
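For illustration, here is a minimal robots.txt along the lines described above (the directory names are hypothetical examples, not paths from any particular site):

```text
# Rules apply to all crawlers
User-agent: *
# Keep a printer-friendly duplicate of the site out of the index
Disallow: /print/
# Keep admin pages from being crawled
Disallow: /admin/

# Point crawlers at the sitemap
Sitemap: https://yourwebsitename.com/sitemap.xml
```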
The Third Step – Get your website a good sitemap
As defined by the internet, a sitemap is:

“A list of pages of a web site accessible to crawlers or users. It is a document or a web page that lists the pages on a web site, typically organized in hierarchical fashion.”

Having a sitemap is important to getting your website crawled quickly; it reduces average crawl time by about 20 hours. This is especially important for content that is constantly updated. For those of you on WordPress, all you need to do is install the sitemap plugin that has been the highest rated for over nine years; it auto-creates sitemaps for you. For everyone else (and those wanting a hands-on approach), log into Google Search Console, verify ownership of your site by adding the HTML file Google provides, and then create and submit your sitemap with Google’s help.
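For those creating one by hand, a minimal XML sitemap in the standard sitemaps.org format looks like this (the URLs and dates are placeholders):

```text
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourwebsitename.com/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
  <url>
    <loc>https://yourwebsitename.com/about/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
</urlset>
```

Save it as sitemap.xml in your site’s root directory, then submit its URL in Google Search Console.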
The Fourth Step – Link to your new website & share it
Once you’re ready to go public, a link to your new website from a quality website is one of the strongest signals Google can receive. Make sure the site you get the link from is not spammy and is crawled by Google regularly. This is an easy option if you, or someone close to you, runs a quality website with content relevant to what your new site is about. You can also take the alternative route made famous by Cyrus Shepard when he left Moz to start his own new website.
The Fifth Step – Manually submit your website URL to Google & Co.
Another good tool for your arsenal is to manually submit the URL you want indexed through Google Webmasters. While we’ve personally found this to work quickly and easily, others see the method as not very useful and choose to skip it altogether.
The Sixth Step – Add a content hub to your website
Some call it articles, some call it featured resources, and some just call it a blog. Whatever the name, it’s absolutely imperative to have a place on your website where new, unique, and valuable content gets regularly published and indexed. As HubSpot put it:

“The average company that blogs generates 55% more website visitors, 97% more inbound links, and 434% more indexed pages.”

That’s powerful stuff. Even at a high level it makes sense: the more pages you have in top search results, the more chances you have to acquire new customers. Producing on-site content gives you that opportunity.