
Search Engine Optimization Questions and Answers Part-II

31. Why does Google penalize a website?

Ans.  Below are the most common reasons:

a. Duplicate Content.

b. Keyword Stuffed Content.

c. Hidden Links.

d. Hidden Content.

e. Rented Links.

f. Black Hat SEO Activities.

g. Link to Suspicious Sites.

h. Over Optimization.

i. Malicious Backlinks.


32. What is the difference between HTTP and HTTPS?

Ans. HTTP stands for Hypertext Transfer Protocol, and HTTPS stands for Hypertext Transfer Protocol Secure.

Both HTTP and HTTPS are protocols that allow your browser to communicate with websites, and you usually see them at the beginning of URLs. HTTP is used for ordinary client-server communication that does not require a secure connection, whereas HTTPS is used for secure sites: the traffic is encrypted, so user information cannot be intercepted.

If a web page asks you to enter credit card information, you should check that the URL begins with https://.

Here are some facts about HTTP: the term hypertext was coined by Ted Nelson, while HTTP itself was created by Tim Berners-Lee. HTTP connections use port 80 by default, and HTTP URLs begin with http://. The first version, HTTP/0.9, was introduced in 1991. HTTP/1.0 was specified in RFC 1945 and officially recognized in 1996. HTTP/1.1 was first specified in RFC 2068 in January 1997 and later revised as RFC 2616 in 1999. HTTP/2 is specified in RFC 7540 and was published in May 2015.

Here are some facts about HTTPS: HTTPS uses port 443 by default to transfer information, and HTTPS URLs begin with https://. HTTPS is HTTP layered over TLS/SSL and is specified in RFC 2818.
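To make the scheme and port facts concrete, here is a minimal sketch in Python (the URLs are hypothetical) that reads the scheme from a URL and maps it to the default port described above:

from urllib.parse import urlsplit

def scheme_and_port(url):
    # Return the scheme (http/https) and the default port it implies.
    parts = urlsplit(url)
    default_ports = {"http": 80, "https": 443}
    return parts.scheme, parts.port or default_ports.get(parts.scheme)

print(scheme_and_port("https://example.com/checkout"))  # ('https', 443)
print(scheme_and_port("http://example.com/blog"))        # ('http', 80)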


33. What are Webmaster Tools?

Ans. Google Webmaster Tools (now called Google Search Console) is a free service from Google that gives you backlink information, crawl errors, search queries, indexing data, CTR, and more.


34. What is keyword density and what is the formula for calculating it?

Ans. Keyword density is the percentage of words on a page that are the target keyword; from an SEO point of view, a sensible keyword density helps make clear what your content is about and helps it stand out. The formula for keyword density is (total number of occurrences of the keyword / total number of words in your article) multiplied by 100.
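A minimal sketch of that formula in Python; the sample text is made up and punctuation handling is deliberately simple:

def keyword_density(text, keyword):
    # (number of times the keyword appears / total number of words) * 100
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words) * 100

article = "SEO basics: good SEO content uses a keyword naturally, never stuffed"
print(round(keyword_density(article, "seo"), 2))  # 2 of 11 words -> 18.18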


35. What is HTTP 404 Not Found Error?

Ans. The HTTP 404 Not Found error means that the web page you were trying to reach could not be found on the server. It is a client-side error, which means either the page has been removed or moved without the URL being changed accordingly, or you typed the URL incorrectly.
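As a small illustration, a page returning 404 can be detected with Python's standard library alone (the URL in the commented example is hypothetical):

import urllib.error
import urllib.request

def is_not_found(url):
    # True when the server answers this URL with HTTP 404 Not Found.
    try:
        urllib.request.urlopen(url, timeout=10)
        return False
    except urllib.error.HTTPError as err:
        return err.code == 404

# print(is_not_found("https://example.com/removed-page"))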


36. What are SEO-friendly URL structures?

Ans. URLs describe a site or page to visitors and search engines. Keeping them relevant, compelling and accurate is the key to ranking well.

The URL of a web document should ideally be as descriptive and brief as possible. If, for example, a site’s structure has several levels of files and navigation, the URL should reflect this with folders and sub-folders. Individual page URLs should also be descriptive without being overly long, so that a visitor who sees only the URL has a good idea of what to expect on the page.
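As a rough illustration, descriptive URLs are often built by turning the page title into a short "slug" and placing it in the right folder; the folder name and title below are made up:

import re

def make_slug(title):
    # Lower-case the title and replace anything that is not a letter or digit with hyphens.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

print("/blog/" + make_slug("SEO Friendly URL Structures Explained"))
# -> /blog/seo-friendly-url-structures-explained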


37. What is robots.txt?

Ans. Robots.txt is the common name of a text file that is uploaded to a website’s root directory. The robots.txt file is used to give web robots and spiders instructions about the website. Web authors can use robots.txt to keep cooperating web robots from accessing all or parts of a website that they want to keep private.
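A small sketch with Python's standard urllib.robotparser showing how a cooperating crawler reads those instructions; the robots.txt rules and URLs below are made up:

from urllib.robotparser import RobotFileParser

# A made-up robots.txt that hides a private folder from all crawlers.
sample_robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(sample_robots_txt.splitlines())

print(parser.can_fetch("*", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post.html"))       # True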


38. Can you optimize a website that has millions of pages?

Ans. Yes. From an SEO point of view, a large dynamic website like this needs some additional SEO work to be implemented (see the sketch after this list):

a. Good internal link structure.

b. Generation of dynamic title and description.

c. Dynamic XML sitemap generation.
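As a rough illustration of point (b), a template can generate a unique title and meta description for each of millions of database-driven pages; the record fields, store name, and values below are hypothetical:

def page_meta(record):
    # Build a unique title and meta description from one database record.
    title = "{name} in {city} | ExampleStore".format(**record)
    description = ("Buy {name} in {city} starting at {price}. "
                   "Check availability and reviews.".format(**record))
    return title, description

title, description = page_meta({"name": "Wireless Mouse", "city": "Austin", "price": "$19"})
print(title)        # Wireless Mouse in Austin | ExampleStore
print(description)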


39. What are the latest updates in SEO?

Ans.  The latest updates in SEO are:

a. Penguin 4.0

b. Hummingbird

c. Panda


40. What are the key aspects of the Panda update?

Ans. Panda is a Google algorithm update aimed at improving the quality of search results. The latest version focuses on quality content, proper design, page speed, proper use of images, and much more.


41. What are the key aspects of the Penguin update?

Ans. Penguin is the code name for a Google algorithm update. Its main target is to decrease the ranking of websites that violate the Google Webmaster Guidelines, typically through black-hat techniques like cloaking and keyword stuffing.


42. How will you neutralize a toxic link to your site?

Ans. Through a backlink quality checker you can see who links to your website. Then go to the toxic link report, where you will find all the links that are harmful to your website. If any link in the toxic link report matches a link pointing to your website, you can neutralize it by using the Google Disavow tool.
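The disavow file submitted to the Google Disavow tool is just a plain-text list of URLs, or whole domains prefixed with domain:, with # marking comments. A minimal sketch that writes such a file; every domain and URL below is made up:

# Hypothetical toxic links collected from the toxic link report.
toxic_domains = ["spammy-directory.example", "paid-links.example"]
toxic_urls = ["http://bad-blog.example/page-linking-to-us.html"]

lines = ["# Links disavowed after reviewing the toxic link report"]
lines += ["domain:" + d for d in toxic_domains]
lines += toxic_urls

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")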


43. What is Search Engine?

Ans. A search engine is a web-based tool that enables users to locate information on the World Wide Web. Some major, commonly used search engines are:

a. Google

b. Yahoo

c. Bing

d. Yandex


44. Tell me something about Google?

Ans. Google is the world's largest and most renowned search engine, with about 66.8% market share. It was introduced in 1998 by Stanford University students Sergey Brin and Larry Page. Its unique ranking algorithm is considered the key to its success. Besides the Google Mail service, various useful tools are offered absolutely free, including Blogger, FeedBurner, YouTube, Google Plus, AdSense, Webmaster Tools, AdWords, Analytics, and many more.


45. What are the different techniques used in OFF Page SEO?

Ans. There are lots of techniques used in OFF-Page SEO work. Major Techniques are:

a. Directory Submission.

b. Social Bookmarking.

c. Blog Post.

d. Article Post.

e. Press Release Submission.

f. Forum Posting.

g. Yahoo Answers.

h. Blog Commenting.

i. Deep Link Directory Submission.

j. Regional Directory Submission, etc.


46. Define Blog, Article and Press Release?

Ans. A blog is information or discussion published on a website or the World Wide Web, consisting of distinct entries called posts. Basically, a blog can be about almost anything, and it can include contributions from others too. It is more individual than an article or a press release: it is personal in both style and the ideas and information it contains, and can be written the way you might talk to your readers. It is also called a web diary or online diary.

Articles are concerned with a specific topic or event and are oriented more towards opinion than plain information: an article is meant to present opinions, views, and ideas. Generally, it is written by a third party or an expert in a specific field.

A press release relates to a specific action or event and can be republished by various mass-media outlets, including other websites. It should be simple, short, and professional, and convey a clear message or piece of information.


47. What are Meta Tags?

Ans. HTML meta tags are usually referred to as tags of page data that sit between the opening and closing head tags of a document's HTML code. They are hidden descriptors that sit in the code: invisible to visitors, but visible to and readable by search engines. For example:



<head>
<title>Not technically a meta tag, but required anyway</title>
<meta name="description" content="write your description here" />
<meta name="keywords" content="write your keywords here" />
</head>



48. What is the difference between keywords and keyword phrases?

Ans. A keyword is basically a single-word term, whereas a keyword phrase is a combination of two or more words. Single-word keywords are usually too broad and competitive, so relying on them alone is not encouraged; in order to drive more traffic and reach the top ranking in the SERPs, it is recommended to use keyword phrases.


49. What is Spider?

Ans. A spider, also called a bot, crawler, or robot, is a computer program that browses the World Wide Web in a methodical, orderly fashion: it automatically scans web pages and websites for new or updated content and downloads a copy to the search engine's data center for indexing.
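A toy sketch of the "browse and follow links" part of a spider, using only Python's standard library; a real crawler would also respect robots.txt, queue the discovered URLs, and store page copies for indexing:

from html.parser import HTMLParser
from urllib.request import urlopen

class LinkSpider(HTMLParser):
    # Collect the href value of every <a> tag found on a page.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = urlopen("https://example.com/").read().decode("utf-8", errors="ignore")
spider = LinkSpider()
spider.feed(page)
print(spider.links)  # the URLs a crawler would visit next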


50. Name the bots (spiders) of the major search engines?

Ans. Google's spider is called Googlebot, Yahoo Search uses Yahoo Slurp, and the Bing search engine uses Bingbot.


51. Define PageRank?

Ans. PageRank is a link-analysis algorithm, named after Larry Page and used by the Google search engine, that assigns a numerical value from 0 to 10 to each element of a set of hyperlinked documents such as the World Wide Web. The displayed value is always a whole number; decimals are not used. A page's rank is calculated from its inbound links.
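A rough sketch of the underlying idea on a tiny, made-up three-page link graph; the real Google computation is vastly larger, and the public 0-10 score was only a scaled, rounded version of it:

# Each page lists the pages it links to.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

# Repeatedly pass each page's rank along its outbound links.
for _ in range(50):
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

print(rank)  # pages with more and better inbound links end up with higher scores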


52. What is Cache?

Ans. Caching is a process performed by search engine crawlers at regular intervals: the crawler scans and takes a snapshot of each page on the World Wide Web and stores it as a backup copy. Almost every search engine results page includes a cached link for each site. Clicking the cached link shows you the last cached version of that specific page rather than the current version.


53. Define ALT Tag?

Ans. The ALT attribute, also called the ALT tag, is used in HTML and XHTML documents to define alternative text that is rendered when the element it applies to cannot be rendered, for example <img src="logo.png" alt="Company logo">. One great feature of ALT text is that it is readable by screen readers, the software through which a blind person can hear a page. It also delivers alternative information for an image when the user cannot view it for some reason, such as a slow connection or an error in the src attribute.


54. What do you know about RSS?

Ans. RSS stands for Really Simple Syndication and is used to publish frequently updated content such as news headlines and blog entries. An RSS document, also known as a web feed, feed, or channel, contains summarized text along with metadata such as authorship and publishing dates.

RSS feeds give publishers flexibility by syndicating content automatically. Because a feed is a standardized XML file, the information is readable by many different programs, and it makes it easier for readers to get timely updates by subscribing to feeds from their favorite sites.
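A minimal sketch that builds a bare-bones RSS 2.0 feed for one new blog post; the site name, URLs, and date are made up:

# One freshly published post to announce in the feed.
post = {"title": "New SEO Checklist", "link": "https://example.com/seo-checklist/",
        "date": "Mon, 20 Feb 2017 10:00:00 GMT"}

feed = (
    "<?xml version='1.0' encoding='UTF-8'?>"
    "<rss version='2.0'><channel>"
    "<title>Example Blog</title>"
    "<link>https://example.com/</link>"
    "<description>Latest posts from Example Blog</description>"
    "<item><title>{title}</title><link>{link}</link>"
    "<pubDate>{date}</pubDate></item>"
    "</channel></rss>"
).format(**post)

print(feed)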


55. How would you define Alexa?

Ans. Alexa is a California-based subsidiary of Amazon that is widely known for its website and toolbar. The Alexa toolbar collects browsing-behavior data and sends it to the Alexa website, where the data is analyzed and stored to create reports on a company's web traffic. Alexa also provides traffic data, global rankings, and other information for websites.


56. How can you achieve Google Page Rank?

Ans. Generally, Google PageRank is based on inbound links, so the more backlinks you accumulate, the higher your PageRank will be. It is also influenced by the rank of the pages that link to you. Another thing to consider is that the older your website is, the more favorable and trusted it will be to Google. Google rewards websites that have lots of pages, plenty of incoming links, and a healthy quantity of internal links to other pages within the site. For SEO projects PageRank itself is relatively insignificant, but it gives a picture of the work to be done towards earning inbound links.


57. Why is the title tag of a website valuable?

Ans. Title tags are very important to our SEO efforts. It is highly recommended to give each page a unique title that says exactly what content sits on that page, for example <title>SEO Interview Questions and Answers</title>. The title tag is valuable because it appears in the search engine results and tells both users and search engines what is on the page.


58. What is Site Map and distinguish between HTML sitemap and xml sitemap?

Ans. A sitemap is a list of the web pages that are accessible to users or crawlers. It might be a document in any form used as a planning tool for a web page or web design, or a page that lists the pages of a website, typically arranged in a hierarchical style. It helps search engine bots and users find the pages on a website, makes the website more search-engine friendly, and increases the probability of frequent indexing.

An HTML sitemap can be included directly as a web page for users' convenience and can be styled to match the site design. An XML sitemap, on the other hand, is useful only to search engine crawlers or spiders, is not visible to users, and sits in the root of the website.
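A minimal sketch that writes an XML sitemap for a handful of made-up URLs; real sites with millions of pages generate this file (and sitemap index files) automatically:

# URLs we want crawlers to discover (all hypothetical).
urls = [
    "https://example.com/",
    "https://example.com/blog/seo-interview-questions/",
    "https://example.com/contact/",
]

entries = "".join("<url><loc>{}</loc></url>".format(u) for u in urls)
sitemap = ("<?xml version='1.0' encoding='UTF-8'?>"
           "<urlset xmlns='http://www.sitemaps.org/schemas/sitemap/0.9'>"
           + entries + "</urlset>")

with open("sitemap.xml", "w") as f:
    f.write(sitemap)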


59. What is the significance of Robots.txt file in a Website?

Ans. The robots.txt file is a useful convention for preventing cooperating web robots and crawlers from accessing all or part of a website, or content that we do not want crawled and indexed even though it is publicly viewable. It is also used by search engines to archive and categorize a website, and to set crawling rules for particular areas of our sites.


60. How are keywords implemented for optimization?

Ans. The best way to choose and implement keywords is to target keywords that are popular, relevant to our content, have high search volume, and are effective. Keyword stuffing and other misuse of keywords must be avoided. To get the best results, a page should not have a keyword density of more than 3-4%. Including keywords in the title and description is highly recommended.
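As a quick illustration of the 3-4% guideline, a rough check (in the spirit of the keyword_density sketch under question 34) can flag text that looks over-optimized; the sample sentence is made up:

def looks_stuffed(text, keyword, max_density=4.0):
    # Flag text whose keyword density exceeds the suggested ceiling.
    words = text.lower().split()
    if not words:
        return False
    density = words.count(keyword.lower()) / len(words) * 100
    return density > max_density

print(looks_stuffed("buy shoes cheap shoes best shoes shoes shop", "shoes"))  # True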
