When a new website launches, the business owner often expects it to appear on Google search result pages instantly.
A new website at a new URL will not be listed unless we tell Google it is there, although it may eventually be crawled if a link to it exists somewhere on the web.
To view a newly launched website, you can simply type the address into the browser's address bar, but to let Google know the site exists, the owner or webmaster must submit the URL.
How to get a website listed on Google search
Once your site is complete and launched, submit your URL via Google Search Console (formerly Google Webmaster Tools).
Google does not guarantee that every website will be added to the index, but if your site is well structured, has complete meta tags and genuine content, it will usually be crawled and indexed fairly quickly. Again, Google does not commit to a timescale, but in my experience it can now happen within a week.
Google Search Console
Google Search Console is an online service that provides webmasters with data and tools to help maintain a Google-friendly website, track when the site is crawled, and discover indexing or security errors.
Within the Search Console, webmasters can add websites, known as ‘properties’.
The webmaster must verify that they are authorised to submit a website, either by uploading a verification file to the server or by adding a DNS record.
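As a rough illustration, the two common verification methods look something like this. The token shown here is a made-up placeholder; Google generates the real file name and record value for you inside the Search Console:

```text
# Option 1: HTML file method
# Upload the file Google gives you to the root of your site, so it is
# reachable at a URL like:
#   https://www.example.com/google1234abcd.html

# Option 2: DNS method
# Add a TXT record to your domain's DNS, for example:
example.com.   TXT   "google-site-verification=abc123placeholdertoken"
```

Either method proves to Google that you control the site or the domain; the DNS method has the advantage of covering every subdomain at once.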
Submitting a sitemap in the Google Search Console
Within the Search Console, webmasters can provide Google with a sitemap.
Sitemaps are a way of telling Google what your site contains and what should be crawled.
A sitemap is a list of web pages, posts and media; within it you can tell Google how often items are updated, so Google knows when to return and re-crawl the site.
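A minimal sitemap is just an XML file listing your URLs (the example.com addresses below are placeholders), with optional tags hinting at when each page last changed and how often it is updated:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2020-01-10</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Most content management systems and SEO plugins can generate a file like this automatically, so you rarely need to write one by hand.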
Submitting a sitemap to the Google Search Console can dramatically reduce the wait before a site is listed.
Adding a site via the Search Console does not mean it will rank highly, but it starts the long road to an organic listing.
Understand how Google lists a website
Googlebot is the crawler that Google sends out to collect information about web pages on the internet and add it to Google’s searchable index. This process is called crawling or fetching.
Googlebot starts with web pages captured during previous crawls. As it revisits pages it has already crawled, it detects new links and adds them to its list of pages to crawl.
Googlebot also adds in sitemap data provided by webmasters.
Once Googlebot gets round to crawling a site's pages, the site will start to appear in search results.
Increase exposure with inbound links to your website
Googlebot often crawls pages that are already listed, looking for new links.
This is why links from related, topical and relevant sites that already rank highly can help you get indexed.
Social network sites like Facebook, Twitter, LinkedIn and Pinterest can help when trying to gain your own organic listings.
Posting your own URLs will not directly improve page rank, but when those URLs are shared by influential people in your industry, it is thought that Google ranks them higher, possibly because sharing affects click-through rate.
Now read my post on getting better search results.