1. Google Analytics
2. Google Webmaster Tools (now Google Search Console)
3. Open Site Explorer
4. Wordstream
Other tools such as Xenu's Link Sleuth, Robots.txt Generator, Google Snippet Preview, and the Pingdom Website Speed Test also help.
Sitemaps are a way to tell Google about the pages on your site. An XML sitemap, usually just called a Sitemap, is a list of the pages on your website. Creating and submitting a sitemap makes sure that Google knows about all the pages on your site, including URLs that may not be discoverable through Google's normal crawling process.
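A minimal sitemap is a small XML file listing one `<url>` entry per page. The sketch below uses the hypothetical site www.example.com with two pages; the `lastmod` dates are made up for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Once the file is uploaded to the site root (typically as /sitemap.xml), it can be submitted through Google Search Console so the listed URLs are queued for crawling.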
Doorway pages are web pages created only for particular keyword phrases; they exist solely to capture those phrases in search engine results. They are mainly created for spamdexing and are also known as bridge pages or gateway pages. Using them is mainly a black-hat SEO technique.
Robots.txt implements the Robots Exclusion Protocol, which website owners use to tell web robots how to crawl and index the pages on their website. For example:
User-agent: *
Disallow: /
The "User-agent: *" line means this section applies to all robots, and the "Disallow: /" line tells them not to visit any page on the site.
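To see how a crawler interprets these two lines, Python's standard-library robots.txt parser can serve as a quick sanity check. The site URL and bot name below are made up for illustration:

```python
from urllib import robotparser

# Parse the two-line robots.txt shown above
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# "Disallow: /" blocks every path for every robot,
# so any fetch is refused regardless of the bot's name.
print(rp.can_fetch("ExampleBot", "https://www.example.com/page.html"))  # False
```

Changing "Disallow: /" to an empty "Disallow:" would have the opposite effect, allowing all robots to crawl everything.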