TOP 100 SEO INTERVIEW QUESTIONS & ANSWERS
Search engine optimization (SEO) is the process of improving the position of a web page or website in search engine results by targeting relevant keywords or phrases.
The two types of SEO are:
- On Page Optimization
- Off Page Optimization
The SEO tools that I use are Google Analytics, Keyword Planner, Alexa, Open Site Explorer, and Google Webmaster Tools.
Outbound links are links from your website to another webpage or website.
To index and update webpages, Google uses Googlebot (a web spider). The crawling, caching, and indexing of a webpage are done by Googlebot, which collects details from that page.
- Cross-linking refers to the process of linking one site to another site.
- It provides the users with reference sites that contain the content related to the search.
- The two cross-linked websites need not be owned by the same person.
- In other words, cross-linking is a barter wherein I link to you and you link to me.
- It could be a 2-way or a 3-way link. In a 2-way link, site A links to site B and site B links to site A. In a 3-way link, site A links to site B, site B links to site C, and site C links to site A.
A keyword is a single word, while a combination of keywords makes a phrase. These keywords and phrases are used by search engines to index subjects across the internet. A search engine stores keywords in its database, and when a search is made, it comes up with the best possible match.
Text on a web page that is not contained in images is referred to as body content relevance, or non-image text. It helps in good optimization of a site and improves its ranking in search engines.
Spider, robot, and crawler are different names for the same thing: a software program that follows, or "crawls", links throughout the internet, grabs content from sites, and adds it to search engine indexes.
If you search for your domain and nothing appears, there are three possibilities:
- The site may be banned by search engines
- The site may not be indexed by search engines
- There may be canonical issues
The process of deriving new keywords from a root keyword in the search query is referred to as keyword stemming. New keywords can be created by adding a prefix or a suffix, or by pluralization.
- Google Webmaster Central
- Search Engine Land
- SEOSmarty
- MOZ
- Search Engine Journal
- BacklinkO
Cloaking is a deceptive way of optimizing a website for search. In this technique, different content is shown to the search engine crawler than what is presented to the end users.
There are mainly four types of Meta tags in SEO:
- Meta Description tag, with a 1,200-pixel limit
- Meta Keyword tag
- Title Tag, with a 600-pixel limit
- Meta Robots
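As a sketch, these four tags sit in a page's head section like this (all of the values below are placeholders, not recommendations):

```html
<head>
  <!-- Title tag: shown as the clickable headline in SERPs -->
  <title>Example Page Title | Brand</title>
  <!-- Meta description: the summary snippet shown under the title -->
  <meta name="description" content="A short summary of what the page is about." />
  <!-- Meta keywords: largely ignored by search engines today -->
  <meta name="keywords" content="example, seo, meta tags" />
  <!-- Meta robots: crawler directives for this page -->
  <meta name="robots" content="index, follow" />
</head>
```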
We can add around 70 characters in the title tag and 222 characters in the Meta Description tag, though Google now enforces a pixel limit rather than a strict character count.
Google sandbox is an imaginary area where new websites and their search ratings are put on hold until they prove worthy of ranking. In other words, it checks the standard of the website.
To get a high ranking on the search engine results page, websites use various methods and techniques, which fall into two categories. Methods that are acceptable under search engine guidelines are known as White Hat SEO, while methods that are not acceptable under those guidelines are known as Black Hat SEO.
- Link Farming
- Hidden text, etc.
- Gateway or Doorway pages
- Cloaking
- Keyword Stuffing
Nofollow links do not pass link juice and have no impact on the Google ranking algorithm. Dofollow links pass link juice and do have an impact on the Google ranking algorithm.
PageRank is calculated on the basis of quality inbound links from other websites or webpages to your webpage or website.
SERP (Search Engine Results Page) is the page of websites or web pages returned by the search engine in response to a search query.
Title tags are essential in SEO, as they describe the contents of a web page. It is mainly through the title tag that the search engine tells the user what is on the page.
Both are necessary: creating quality content is just as important as building backlinks. Although building backlinks is useful for establishing a site's authority and for ranking, quality content is the first element considered responsible for ranking.
SEM (Search Engine Marketing) is the promotion of a website through paid advertising, increasing its visibility in the Ads section of the Search Engine Results Page (SERP), while SEO is optimizing a site to increase its organic ranking.
LSI stands for Latent Semantic Indexing. This technique obtains data by relating a word to its closest counterparts or to a similar context. For example, if you search with the keyword CAR, it will show related results like classic cars, car auctions, Bentley cars, car races, etc.
To check whether your SEO campaign is working, the first approach is to check the website's statistics, which tell you about the origin of your traffic. The other way is to search for the relevant keywords and key phrases and look at the search results. Where your pages appear in those results will tell you whether your SEO campaign is working or not.
Competitive analysis compares the website I am optimizing with the websites that rank highly in search results.
My first attempt would be to analyze the problem and resolve it step by step:
- Firstly, I would check whether it is a new project, and then re-check the keywords.
- I would also look for relevant keywords that can be helpful.
- If the webpage and website have been indexed well but still do not appear in the first 10 pages of the search engine results, I would make changes to the page text, titles, and descriptions.
- If the website is not indexed well or has been dropped from the index, it might have serious issues and re-work might be required.
Robots.txt is a text file through which instructions are given to search engine crawlers about the indexing and caching of a webpage, file, directory, or domain.
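A minimal robots.txt sketch, placed at the site root (the disallowed path and sitemap URL below are placeholders):

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```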
PPC stands for Pay Per Click and is an advertising model hosted by Google. It is segmented into two modules, CPC (Cost Per Click) and CPM (Cost Per Thousand Impressions), through flat rates and bidding respectively. In CPC, the advertiser is charged only if the user clicks on the advert.
A 301 redirect is a method by which users are redirected from an old page URL to a new page URL. It is a permanent redirect and is also useful in passing link juice from the old URL to the new URL.
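For example, assuming an Apache server, a 301 redirect can be declared in the site's .htaccess file (both paths below are placeholders):

```apache
# Permanently redirect the old URL to the new one
Redirect 301 /old-page.html https://www.example.com/new-page.html
```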
Webmaster tool is a service provided by Google from where you can get backlink information, crawl errors, search queries, Indexing data, CTR etc.
From an SEO point of view, keyword density helps your content stand out from others. The formula for keyword density is: (total number of keyword occurrences / total number of words in your article) × 100.
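The formula above can be sketched in Python (the function name and the simple whitespace tokenization are my own simplifications):

```python
def keyword_density(text, keyword):
    """Keyword density = (keyword occurrences / total words) * 100."""
    words = text.lower().split()
    target = keyword.lower()
    # Strip trailing punctuation so "seo," still counts as "seo"
    occurrences = sum(1 for w in words if w.strip(".,:;!?") == target)
    return (occurrences / len(words)) * 100 if words else 0.0

# "seo" appears 2 times among 10 words -> density of 20%
print(keyword_density("SEO basics: learn SEO by writing content users love today", "seo"))
```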
The first step would be to add permanent redirects from all pages of the previous site to the corresponding new pages. After that, I would request removal of the previous content from search engines in order to avoid duplicate content issues.
From an SEO point of view, dynamic websites require some additional SEO work:
- Good Internal link structure
- Generation of dynamic title and description
- Dynamic XML sitemap generation
The latest updates in SEO are:
- Panda
- Penguin
Panda aims to improve the quality of search results in Google. The latest version focuses on quality content, proper design, proper speed, proper use of images, and more.
Through a backlink quality checker you can find out who links to your website. Then go to the Toxic Links report, where you will find all the links that are harmful to your website. If any link in the Toxic Links report points to your website, you can remove it by using the Google Disavow tool.
To spot someone building or redirecting low-quality links to your site, you can use tools like:
- Ahrefs
- Open Site Explorer
Check at regular intervals. You can either request the webmaster to remove the bad links or disavow them.
A backlink tool runs a series of tests to tell you how many backlinks point to the URL you entered. Additional information is also collected, such as the anchor text used, the Domain Authority and trust of the backlink source, and any potential flags or warnings for each individual link.
A link audit can be a tedious and complicated process. If you have just started building links, you can audit quite often, but a complete link audit should be done approximately once a year.
A frame in HTML is a technique that divides the content of a page into several parts. Search engines consider frames to be completely separate pages, so frames can have a negative impact on SEO. We should avoid the use of frames and use basic HTML instead.
If the SEO method doesn't work, do the following:
- First see whether it is a new project then re-check the keywords
- Then look for relevant keywords that can be helpful
- Make changes in page text, title and description
- If it is still not ranked, there may be other serious issues like bad links, a Penguin/Panda or other Google penalty, crawlability issues, UI issues, etc.
If your website is banned by the search engines for black hat practices, you can apply for re-inclusion after correcting your wrongdoings.
The most important areas in which to include your keywords are:
- Page title
- Body text
- Meta Description
Google Algorithm Updates Questions
Google makes changes to the search algorithm to improve the quality of results. Sometimes Google implements minor tweaks and sometimes major updates like Penguin, Panda, Hummingbird, the Payday Loan update, etc.
The best sources are sites that track all updates ordered by date, such as Moz's Google Algorithm Update History page.
The Mobile Friendly update was released on 21st April 2015. The major target of the update was to make websites mobile friendly. Sites that didn't follow the guidelines of mobile SEO were affected by the update.
According to industry sources, Google updates its search algorithm 500-600 times yearly. These updates are classified as minor and major updates. Most of the time we only get information about the major updates, which have a very significant impact on SERPs.
The Panda update was first released in Feb 2011, and the major purpose of this update was to penalize websites with low-quality, duplicate, or thin content created only for SEO purposes. In May 2014 Panda 4 was released, which hit many major brand sites like eBay, ask.com, biography.com, and others.
The Penguin update was first released in April 2012. This update targeted websites using black hat SEO techniques and over-optimization in violation of the search guidelines. Later releases of the Penguin update targeted websites generating links from low-quality sources with keyword-rich anchor text.
The Hummingbird update was released in August 2013. The purpose of this update was to understand the intent behind a user's query and provide the results best suited to that user. Instead of ranking a page because of keyword density, this update understands the semantics of the query and the relevancy of the content, and provides suitable results.
For example, if you search for keyword: which is the best business hotel in Hyderabad ? then hummingbird will understand the intent of the user and provide the results for best business hotel in Hyderabad by ignoring the words [which is] and [?] to provide relevant results.
The EMD update was released by Google in Sep 2012 to target low-quality websites with exact match domain names. The sites hit hardest were long-tail domains stuffed with keywords; sites like www.buy-best-laptops-online.com were hit by the update. Even some good sites were affected: for example, www.pooltables.com lost its 1st position after the EMD update.
If we use a keyword in the domain along with other words or a brand name, we must ensure that we provide quality content on the site so that it is protected from the EMD filter.
Google released this algorithm to provide more relevant results in Google Local results. The algorithm considers traditional web search signals as crucial factors for local ranking. The algorithm's name was coined by Search Engine Land.
This update was released to target websites distributing pirated material like pirated movies, software, and other copyrighted material. Most torrent sites and online movie distribution sites were hit by this update.
RankBrain is an artificial intelligence (AI) software program used to help process search queries. Google announced this update in October 2015. RankBrain uses artificial intelligence to process large amounts of data and convert the data into a format that can be interpreted by Google's machine learning systems.
Google has integrated the Penguin algorithm into its core algorithm, and it now works in real time. This announcement was made by Google in Sep 2016. The update was planned for release in phases, so at the end of September and the beginning of October, Google launched two phases of this update.
On-Page Optimization Questions
Keyword research is useful for understanding keyword search volumes and predicting user demand for a product or service.
In general, keywords can be classified into 3 types: informational, navigational, and transactional.
Informational: The intention of the user is to gain knowledge about a specific topic or service. Ex: Who are the founders of Apple?
Navigational: Queries that seek information about a specific branded site or website. Ex: Apple Mac Support
Transactional: The user's intention is to find a product or service, and the query has a transactional nature. Ex: Buy iPhone Online
We have many tools for keyword research; the most popular are Google Auto-suggestion, Google Keyword Planner, Ubersuggest, Bing Keyword Tool, Keyword.io, and the WordStream keyword tool.
From August 2016 Google started restricting keyword data for non-advertiser accounts. Only advertisers with active campaigns can now access exact monthly volume data; non-advertisers see the data only as a range.
I use the Google Trends tool to analyze trending topics in Google: www.google.com/trends/
- Unique title with relevant keyword
- Search engine friendly URLs
- Proper meta description tag
- Using heading in proper order
- High-quality content with appropriate keyword density
- Structured data in page
- Social Sharing
- Internal linking
- Latent semantic keywords
- Schema integration
55-60 characters is the optimal length of the title.
The meta description is the two-line summary displayed in the SERP. Generally, the description is 155 characters; anything longer is automatically truncated by Google. The keywords in the meta description are not used for ranking [Google clarified in Sep 2009 that it does not use the meta description for ranking], but a proper meta description will improve the click-through rate (CTR) of the page.
<meta name="description" content="Write a brief summary of the page and explain precisely what the page is about." />
No, this tag is not used by any major search engine for ranking purposes. In 2009 Google announced that this tag is not used in its ranking factors. But as per analysis, we can see that many popular websites still use this tag, so it does no harm to include it with relevant keywords.
Schema is markup code that provides more useful information to users on SERPs. The project is supported by Google, Yahoo, Bing, and Yandex. It is also called Structured Data or Rich Snippets.
The most commonly used schema types are as follows:
- Video Schema
- Aggregate Reviews
- Person information
- Events
- Single review
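As a sketch, an aggregate-review schema snippet in JSON-LD might look like this (the product name and rating values below are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
</script>
```

Google can use this markup to show star ratings for the page directly in the SERP.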
Link Building (Off-Page) Optimization
Links are one of the crucial elements of SEO. With high-quality links, we can improve the ranking of the site.
Yes, I use several specific rules to define the quality of links. I look at the following factors before submission:
- Quality of Backlink (PR, PA, DA must be good)
- Relevancy of the content between 2 sites
- Proper use of Anchor Text
- Avoid site-wide links
- Avoid links from low-quality sites
Any link with the attribute rel="nofollow" is called a nofollow backlink. Nofollow backlinks do not pass any PR value, but are still helpful for link diversity.
Generally, all public websites like social media sites, business listing sites, etc. offer nofollow backlinks. Example: Facebook, YouTube, JustDial, and similar sites offer these links.
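For example, a nofollow link in HTML looks like this (the URL is a placeholder):

```html
<!-- This link does not pass PR value to the linked site -->
<a href="https://www.example.com/" rel="nofollow">Example site</a>
```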