What is SEO?
Below is a list of the most frequently asked questions about search engine optimization, along with answers, recommendations, and best practices, logically organized into chapters and subchapters. The result is one of the easiest-to-follow guides to SEO basics for beginners.
Table of Contents
SEO Basics
What are organic search results?
How is website traffic measured?
What is SEO in digital marketing?
Search Engines
What is the best search engine?
What is the most popular search engine?
Why is Google the most popular search engine?
How does Google's web crawler work?
What is the meaning of web crawling in SEO?
How do I get Google to crawl my website?
How to make sure Googlebot isn’t blocked?
What is indexing in Google Search?
How long does it take for Google to index a new site?
How do Google search algorithms work?
Search engine ranking factors
How many factors does Google use to rank web pages?
What search engine ranking factors are most important?
What are negative ranking factors?
SEO best practices
1.1. Improve the structure of your URLs
1.2. Make your site easier to navigate
2.1. Create unique, accurate page titles
2.2. Make use of the “description” meta tag
2.3. Offer quality content and services
2.4. Write better anchor text
2.5. Optimize your use of images
2.6. Use heading tags appropriately
3.1. Make effective use of robots.txt
3.2. Be aware of rel=”nofollow” for links
4.1. Notify Google of mobile sites
4.2. Guide mobile users accurately
5.1. Promote your website in the right ways
5.2. Make use of free webmaster tools
Conclusion
SEO Basics
What is SEO?
SEO is an abbreviation for the term search engine optimization. Search engine optimization (SEO) is what you do to increase the traffic to your website from organic search results on platforms such as Google and Bing.
What is website traffic?
Website traffic is anything that visits your website. This includes both actual human visitors and automated programs (called bots).
What are organic search results?
Whenever a person types something into a search engine, they are presented with a page containing search results for the term they were searching for.
Each search results page contains two types of results: paid and organic.
Organic search results are listings on search engine results pages that appear because of their relevance to the searched term.
In contrast, paid search results are advertisements where an advertiser pays to have their ad displayed whenever someone runs a search query matching the advertiser’s criteria.
How is website traffic measured?
Website traffic is measured through a specialized website analytics service by adding a short piece of computer code to your webpages. This code is normally generated by the service you are using to measure website traffic.
Depending on what service you use, the code may track all kinds of things, including, for example, the number of visitors to your website, how long they stay, which pages they visit, what browsers they use, where they are located, etc.
Currently, Google Analytics is perhaps the most popular website analytics service out there. Also, it’s free.
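The exact code is generated for you by the analytics service when you set up an account. As a rough illustration, a Google Analytics tag looks something like the snippet below, placed in the head of every page you want to track (GA_MEASUREMENT_ID is a placeholder – the service gives you the real value):

```html
<!-- Illustrative Google Analytics (gtag.js) tag; GA_MEASUREMENT_ID is a placeholder -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID');
</script>
```

Once this tag is on your pages, the reports in your analytics account start filling up with visitor data automatically.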
What is SEO in digital marketing?
SEO is an integral part of any digital marketing strategy that relies on driving website traffic from organic search results – it is what makes that part of the strategy happen.
What is the use of SEO?
If done correctly, SEO will benefit you in four ways. It will:
- Allow your website to come up in search results for the most profitable search queries (also called keywords);
- Improve your search engine rankings – that is, the position at which a page from your website appears in the search engine results;
- Ensure that the best page from your website appears in the results for each individual search query; and
- Improve the clickthrough rate to your website – that is, the ratio of users who click on your result as opposed to all others.
Why do you need SEO?
You need SEO to increase your share of free website traffic from search engines. According to some estimates, a total of 6.5 billion searches are performed each day. Because the top-ranked search results get almost all of this traffic, you need SEO to get your website to the top of the search result pages.
Why is SEO so important?
Performing SEO is the only way for your website to rank higher in the search results without paying for advertising. SEO helps you maximize the benefits of search engines as a marketing channel.
Search Engines
What are search engines?
Search engines are web-based services that look for and identify items on the Internet – usually, web pages, images or videos – that best correspond to queries typed by the user.
What is the best search engine?
The best search engine is one that returns the most relevant results (the ones you are actually interested in) as quickly as possible, through an uncluttered, easy-to-use interface, and offers additional options to broaden or narrow your search.
What is the most popular search engine?
With a 79% share of the search market, Google is by far the most popular search engine – in fact, one could argue it is effectively a monopoly. Other popular search engines include Bing, Baidu, and Yahoo.
Why is Google the most popular search engine?
Google became the most popular search engine because it arrived with the right product at the right time.
First, Google developed a revolutionary technology that delivered truly relevant search results. In the absence of any significant innovation from Yahoo, the search leader at the time, this allowed Google to quickly overtake it and become the de facto search engine of choice.
Another factor that played into Google's hands was timing: it emerged just as mainstream consumers were getting better Internet connectivity and becoming regular Internet users. By being the only search engine that consistently delivered relevant results, Google built a massively loyal user base.
Finally, by turning pay-per-click advertising into a very profitable source of revenue, Google was able to attract the best talent, keep innovating, and build a massive war chest for future acquisitions – all of which it used quite effectively to cement its leadership position and protect itself against competition.
How does Google work?
- Crawling and indexing – First, long before any search query is typed in, Google's web crawlers work relentlessly to discover and process the information contained in web pages and other publicly available content. Once discovered, that information is stored and organized in Google's own database, called the Search index.
- Search algorithms – When a search query is typed into Google, its ranking systems sort through hundreds of billions of webpages in the Search index to give you useful and relevant results in a fraction of a second.
- Useful responses – At the same time, Google makes sure to offer search results in a range of rich formats to help you find the information you’re looking for quickly.
What is web crawling?
Web crawling is the process of discovering new content on the Internet performed by a web crawler. A web crawler, sometimes called a spider, is an Internet bot used by search engines that systematically browses the Internet, looking for publicly available information.
How does Google's web crawler work?
Google's web crawler, called Googlebot, is a program designed to discover new and updated pages for inclusion in the Search index. Because of the sheer size of the Internet, Google uses a huge set of computers to perform this task.
Google’s crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters.
As Googlebot visits each of these websites, it detects links on each page, and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Search index.
What is the meaning of web crawling in SEO?
Making sure that web crawlers, and specifically Googlebot, can and will access your website is one of the cornerstones of search engine optimization.
Put simply, until Googlebot crawls and indexes your web pages, there is no chance of you generating any free traffic from Google.
How do I get Google to crawl my website?
Googlebot uses an algorithmic (automated) process to determine which sites to crawl, how often to crawl them, and how many pages to fetch from each site.
You cannot directly control this process, and Google doesn't accept payment to crawl a site more frequently.
However, one thing you can do is to ensure Googlebot knows about you. Here’s how:
- Make sure Googlebot isn’t blocked.
- Use the Submit URL option in Google Search Console.
- Create a Sitemap and submit it to Google Search Console (a minimal Sitemap sketch follows this list).
- If you’ve recently added or made changes to a page on your site, you can ask Google to (re)index it using the Fetch as Google tool.
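A Sitemap is simply an XML file that lists the URLs you want Google to know about. A minimal sketch, using hypothetical example.com addresses, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml - lists the pages you want search engines to discover (hypothetical URLs) -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/seo-basics/</loc>
  </url>
</urlset>
```

Once the file is uploaded to your site (typically at the root, e.g. example.com/sitemap.xml), you submit its URL in the Sitemaps section of Google Search Console.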
How to make sure Googlebot isn’t blocked?
Blocking Googlebot from accessing a site prevents it from crawling and indexing your website’s content. It can also lead to a loss of ranking in Google’s search results for previously indexed web pages.
If you suspect Googlebot may be blocked from accessing your site, log in to your Google Search Console account and check the following:
- Messages – Google usually displays a prominent message if you are blocking Googlebot from crawling your site.
- Crawl Errors – Review the list of crawl errors and look for any pages that you believe should be indexed. If there are such pages reported on the Crawl Errors list, it means Googlebot encountered a problem when it tried to crawl that URL.
- Fetch as Google – when you find a problematic URL, use the Fetch as Google function for more detailed information as to what the problem might be.
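One of the most common causes of blocking is an overly broad rule in your site's robots.txt file. As a hypothetical illustration, the first rule below blocks every crawler (including Googlebot) from the entire site, while the second allows full crawling:

```
# Blocks ALL crawlers, including Googlebot, from the whole site - an easy mistake to make
User-agent: *
Disallow: /

# Allows everything to be crawled (an empty Disallow value blocks nothing)
User-agent: *
Disallow:
```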
What is indexing in Google Search?
Google indexing is the process of adding your website’s information into the Search index.
Googlebot processes each web page it crawls and compiles a massive index of all the words it sees and their location on each page. In addition, Google processes information included in key content tags and attributes, such as Title tags and ALT attributes.
How long does it take for Google to index a new site?
On average, it could take anywhere from 2 days to 3-4 weeks for Google to index a brand new site. A lot depends on the quality of your website and the work done to optimize it for search engines.
How do Google search algorithms work?
In order to analyze what it is you are looking for and what information to return to you, Google has to sort through hundreds of billions of web pages in their Search index in a fraction of a second. To do this, Google ranking systems use a series of algorithms that do the following:
- Analyze words – Google tries to understand the meaning of your search and decide what strings of words to look up in the Search index. This also involves interpreting spelling mistakes, understanding synonyms and applying some of the latest research on natural language understanding.
- Match a query – At the most basic level, Google analyzes how often and where keywords relevant to your query appear on a page. They also analyze whether the pages include other relevant content and whether they are written in the same language as your query.
- Rank pages – Using hundreds of factors, Google tries to identify the best web pages that match your search query. These include the freshness of the content, good user experience, website’s trustworthiness and authority, etc. Google also identifies and removes sites that violate their webmaster guidelines.
- Consider context – Information such as the user’s location, past search history, and Search settings helps Google personalize results to what is most useful and relevant for that particular user at that moment.
- Return best results – Before serving the search results, Google evaluates how all the relevant information fits together and then strives to provide a diverse set of information in formats that are most helpful for a given type of search.
Search engine ranking factors
How many factors does Google use to rank web pages?
When a user enters a query, the Google algorithm searches the index for matching web pages and returns the results it believes are the most relevant to the user. Relevancy is determined by over 200 factors.
What search engine ranking factors are the most important?
Google doesn’t release a list of its ranking factors to the public. Therefore, SEO experts have to rely on their own testing and (sometimes) guessing. According to the 2015 Search Engine Ranking Factors expert survey by Moz, the most important search engine ranking factors were:
1. Domain-Level, Link Authority Features (importance: 8.22 out of 10) – these are factors related to incoming links (backlinks) for the entire website:
- Quantity of unique linking domains
- Topical relevance of linking domains
- Raw popularity of the domain as measured by PageRank, etc.
- Trust of the domain as measured by TrustRank, etc.
- Distribution of linking domains’ authorities/relative importance/popularity
- Backlinks from sites of your own geo-targeted area or language
- Percentage of links with brand terms in the anchor text
- Velocity of link acquisition to the domain
- Sentiment of the external links pointing to the site
2. Page-Level Link Metrics (importance: 8.19 out of 10) – these are factors related to incoming links (backlinks) for the page you want to rank:
- Raw quantity of links from high-authority sites
- Topical relevance of linking pages
- Topical relevance of linking domains
- Diversity of link anchor text to the page
- Raw quantity of links from known brands/entities to page
- Raw quantity of unique linking domains to the page
- Trust as measured by the distance from a trusted seed set of pages/sites
- Position/context of inbound link
- Popularity of the page as measured by algorithms like PageRank, etc.
- Link velocity of the page
- Raw quantity of links that employ the keyword as partial-match anchor text
- Raw quantity of links that employ the keyword as the exact-match anchor text
- Sentiment of the external links pointing to the page
3. Page-Level Keyword & Content-Based Metrics (importance: 7.87 out of 10) – these are factors that evaluate the quality and relevance of the web page’s content:
- Uniqueness of the content on the page
- Freshness of the content on the page
- Length of content on the page
- Page contains Schema.org or other structured data
- Reading level of the content on the page
- Use of images on the page
- Use of rich media (video, slides, etc.)
- Page contains Open Graph data and/or Twitter cards
4. Page-Level, Keyword-Agnostic Features (importance: 6.57 out of 10) – these factors assess other aspects of the web page that are not specifically related to the search query:
- Page is mobile friendly (for mobile rankings)
- Page Load Speed
- The age of the page
- Quality of supplemental content on page
- Page supports HTTPS / SSL
- Author authority of page
- Page is mobile-friendly (for desktop rankings)
5. User Usage & Traffic/Query (importance: 6.55 out of 10) – these factors evaluate the user experience, both on the domain and page level:
- Click-through rate from Google search results pages
- Quantity of searches for the keyword, brand name, URL, or domain name
- Pure bounce rate of the page
- Return visits to this page after initial query/click
- Overall design and/or user experience
- Dwell time or long click metrics
- Average browse rate
6. Domain-Level Brand Metrics (importance: 5.88 out of 10) – these are factors that assess the importance of the brand and its relations to the keyword:
- Search volume for the brand/domain
- Existence/quality of verified real-world business info
- Quantity of citations for the domain name across the web
- Quantity of co-occurrence keyword + brand across web
- Quantity of mentions of the brand/domain on social sites
- Popularity of business’s official social media profiles
7. Domain-Level Keyword Usage (importance: 4.97 out of 10) – these are factors that assess the presence and position of the keyword in the domain name:
- Keyword is the exact match root domain name
- Keyword is present in root domain name
- Keyword is closely related to domain name through entity association
- Keyword is the subdomain name
- Keyword is the domain extension
8. Domain-Level, Keyword-Agnostic Features (importance: 4.09 out of 10) – these factors evaluate the quality of content for the entire website, as a whole:
- Uniqueness of content across the whole site
- Use of responsive design and/or a mobile-optimized version
- Aggregated click-through-rate from Google SERPs for the domain
- Freshness of content on the site
- Aggregated page load speed for pages of domain
- Aggregated dwell time or long click metrics for domain
- Quantity of error pages crawled on the site
- Age of domain
- Domain is associated with high-authority authors
- Domain contains trust signal pages
- Domain lists contact information
- Quality of other sites hosted on the same block of IP addresses
- Character length of domain name
- Presence of hyphens used in domain name
- Length of time until domain name expires
9. Page-Level Social Metrics (importance: 3.98 out of 10) – these are factors that assess the web page’s performance on social media:
- Engagement with content/URL on social networks
- Raw count of Google+ shares and +1s associated with URL
- Raw count of Tweets associated with URL
- Raw count of Facebook likes and shares associated with URL
- Comments about the page on social sites
- Sentiment of social links and citations referring to the page
- Raw count of Pinterest pins associated with URL
- Upvotes for the page on social sites
What are negative ranking factors?
Negative ranking factors are signals that can lower a page’s position in the search results or keep it from ranking at all. The most commonly cited ones include:
- Total number of unnatural links to a page/subdomain
- Page is duplicate content
- Page content is thin
- Amount of over-optimized anchor text on page
- Non-mobile friendly (for mobile SERPs)
- Keyword stuffing in document
- Page links to spam
- Page has relatively poor searcher satisfaction
- Slow page speed
- Page has relatively poor engagement/usage metrics
- Total amount of advertising on page
- Page links to high number of 404 pages
- Page assets (CSS/JS) are blocked by robots.txt
- Total number of times links from page and/or subdomain have been disavowed
- Page contains a duplicate title tag
- Non-mobile friendly (for desktop SERPs)
- Page contains a duplicate meta description
SEO best practices
Here are the best practices for search engine optimization as recommended by Google in its Search Engine Optimization Starter Guide (a PDF document). They apply to all websites, regardless of their topic, size, language, etc.
1. Improving Site Structure
1.1. Improve the structure of your URLs
Simple-to-understand URLs can lead to better crawling of your pages by Googlebot. They also make it easier for those who want to link to or visit your content. Finally, a page’s URL is displayed as part of its search result in Google, right below the page’s title.
Do:
- Use words in URLs.
- Use a directory structure that makes it easy for visitors to know where they are on your site.
- Provide a single version of a URL to reach a given document.
Don’t:
- Use lengthy URLs with unnecessary parameters and session IDs.
- Choose generic page names like “page1.html”.
- Use excessive and/or repetitive keywords.
- Have deep nesting of subdirectories like “…/dir1/dir2/dir3/dir4/dir5/dir6/page.html”.
- Use directory names that have no relation to the content in them.
- Have multiple URLs that access the same content.
- Use odd capitalization of URLs.
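To illustrate the points above, here is a hypothetical before-and-after for the same page:

```
# Hard to understand: deep nesting, a generic page name, and a session ID in the URL (hypothetical)
https://www.example.com/dir1/dir2/dir3/dir4/page1.html?sessionid=84fb21

# Easier to understand: short, descriptive words and a shallow directory structure
https://www.example.com/guides/seo-basics/
```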
1.2. Make your site easier to navigate
The navigation of a website is important in helping Google understand what content is important.
Do:
- Make it as easy as possible for users to go from general content to the more specific content. Add navigation pages when it makes sense.
- Use mostly text for navigation links and menus.
- Create an HTML sitemap, and use an XML Sitemap file.
- Have a custom 404 page that guides users back to a working page on your site.
Don’t:
- Create complex navigation menus or link every page on your site to every other page.
- Make users click through too many tiers to access a particular page.
- Have a navigation based entirely on drop-down menus, images, or animations.
- Allow your 404 pages to be indexed in search engines.
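As a small illustration of text-based navigation, a simple breadcrumb trail (hypothetical site structure) lets users jump back from specific content to more general sections:

```html
<!-- A plain-text breadcrumb trail near the top of the page (hypothetical pages) -->
<p>
  <a href="/">Home</a> &gt;
  <a href="/guides/">Guides</a> &gt;
  SEO Basics
</p>
```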
2. Optimizing Content
2.1. Create unique, accurate page titles
Besides being an important element of SEO, the title tag will usually also appear in the first line of the search results. If the words in the title tag match the words in the search query, those words are bolded. This helps users recognize that a page is relevant to their search.
Do:
- Choose a title that effectively communicates the topic of the page’s content.
- Create unique title tags for each page.
- Use brief, but descriptive titles.
Don’t:
- Choose a title that has no relation to the content on the page.
- Use default or vague titles like “Untitled” or “New Page”.
- Use a single title tag across all of your site’s pages or a large group of pages.
- Use extremely lengthy titles that are unhelpful to users.
- Stuff unneeded keywords into your title tags.
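For example (using a hypothetical online store), compare a vague, reusable title with a brief, descriptive one that is unique to the page:

```html
<!-- Vague and reusable - tells users and search engines very little -->
<title>Untitled</title>

<!-- Brief but descriptive, and unique to this particular page (hypothetical store name) -->
<title>Handmade Leather Wallets | Example Crafts Store</title>
```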
2.2. Make use of the “description” meta tag
Description meta tags are important because Google might use them as snippets for your pages in the search results. Just as with the title tag, words in the snippet are bolded when they match the user’s query.
Do:
- Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result.
- Use unique descriptions for each page.
Don’t:
- Write a description meta tag that has no relation to the content on the page.
- Use generic descriptions like “This is a web page”.
- Fill the description with only keywords.
- Copy and paste the entire content of the document into the description meta tag.
- Use a single description meta tag across all of your site’s pages or a large group of pages.
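The description meta tag sits in the page’s head section, alongside the title tag. Here is a sketch for the same hypothetical store page:

```html
<head>
  <title>Handmade Leather Wallets | Example Crafts Store</title>
  <!-- A unique, human-readable summary that Google may show as the snippet in search results -->
  <meta name="description" content="Browse Example Crafts Store's range of handmade leather wallets, with free shipping and a two-year guarantee.">
</head>
```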
2.3. Offer quality content and services
Compelling and useful content will positively influence your website more than any other SEO factor, because satisfied users will likely want to direct other users to it through blog posts, social media, email, forums, etc.
Do:
- Write easy-to-read text.
- Stay on topic.
- Break your content up into logical chunks.
- Create fresh, unique content.
- Create content primarily for your users, not search engines.
Don’t:
- Write sloppy text with many spelling and grammatical mistakes.
- Embed text in images.
- Dump large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation.
- Rehash or copy existing content that will bring little extra value to users.
- Have duplicate or near-duplicate versions of your content across your site.
- Insert numerous unnecessary keywords aimed at search engines.
- Deceptively hide text from users, but display it to search engines.
2.4. Write better anchor text
Anchor text – the clickable text that users will see as a link – is used by Google to judge what the destination page is about. It is an important SEO factor.
Do:
- Choose descriptive text.
- Write concise text.
- Format links so they’re easy to spot.
- Think about anchor text for internal links too.
Don’t:
- Write generic anchor text like “click here”.
- Use text that has no relation to the content of the page linked to.
- Link a lengthy sentence or a paragraph of text.
- Use styling that makes links look like regular text.
- Use excessively keyword-filled anchor text just for search engines.
- Create unnecessary links that don’t help with the user’s navigation of the site.
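For example (hypothetical URLs), generic anchor text says nothing about the destination, while concise, descriptive anchor text helps both users and Google:

```html
<!-- Generic anchor text - gives no clue about the destination page -->
<a href="https://www.example.com/seo-guide/">click here</a>

<!-- Concise, descriptive anchor text that is easy to spot within the sentence -->
Read our <a href="https://www.example.com/seo-guide/">beginner's guide to SEO</a> for more detail.
```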
2.5 Optimize your use of images
Using optimized images can help improve a web page’s rankings and also bring extra traffic from image search.
Do:
- Use brief, but descriptive file names and alt text.
- Supply alt text when using images as links.
- Supply an Image Sitemap file.
Don’t:
- Use generic filenames like “image1.jpg”.
- Write extremely lengthy filenames.
- Stuff keywords into alt text or copy and paste entire sentences.
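For example (hypothetical image files), descriptive filenames and alt text tell Google what an image shows, and the alt text doubles as anchor text when the image is used as a link:

```html
<!-- Generic filename and no alt text - nothing for image search to work with -->
<img src="image1.jpg">

<!-- Brief but descriptive filename and alt text -->
<img src="brown-leather-wallet.jpg" alt="Hand-stitched brown leather wallet">

<!-- When an image is used as a link, its alt text acts much like anchor text -->
<a href="https://www.example.com/wallets/"><img src="wallets-category.jpg" alt="Browse all leather wallets"></a>
```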
2.6. Use heading tags appropriately
Headings create a hierarchical structure for your content, making it easier for users to navigate through your page.
Do:
- Use headings to communicate the page’s outline and hierarchy.
- Use heading tags where it makes sense.
Don’t:
- Place unhelpful text in heading tags.
- Use heading tags where other tags like <em> and <strong> may be more appropriate.
- Erratically switch from one heading tag size to another.
- Excessively use heading tags throughout the page.
- Put all of the page’s text into a heading tag.
- Use heading tags for purposes other than presenting structure.
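For example, a sensible heading hierarchy for the hypothetical wallet page might look like this (indentation added only for readability):

```html
<h1>Handmade Leather Wallets</h1>
  <h2>How our wallets are made</h2>
    <h3>Choosing the leather</h3>
    <h3>Stitching and finishing</h3>
  <h2>Care and maintenance</h2>
```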
3. Dealing with Crawlers
3.1. Make effective use of robots.txt
You can decide whether you want Google to crawl and index all of your pages or just some. A “robots.txt” file is one way to tell search engines whether they can access and therefore crawl parts of your site.
Do:
- Use more secure methods for sensitive content.
Don’t:
- Allow search result-like pages to be crawled.
- Allow URLs created as a result of proxy services to be crawled.
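As a sketch of the points above, a robots.txt file that keeps internal search-result pages out of the crawl while leaving everything else open might look like this (hypothetical site and paths). Remember that robots.txt only controls crawling, so truly sensitive content needs stronger protection such as authentication:

```
# Hypothetical robots.txt, served at https://www.example.com/robots.txt
User-agent: *
# Keep internal search-result-like pages from being crawled
Disallow: /search/
# Everything else remains crawlable
```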
3.2. Be aware of rel=”nofollow” for links
Using the rel=”nofollow” attribute tells Google that certain links on your site shouldn’t be followed or pass reputation to the pages linked to. This is particularly useful for sites that have user-generated content, such as message boards or blog comments.
Do:
- Automatically add “nofollow” to comment columns and message boards.
- Use “nofollow” when you wish to reference a website, but don’t want to pass your reputation on to it.
Don’t:
- Accidentally “nofollow” all of your internal links.
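For example, a link posted by a visitor in a blog comment (hypothetical URL) can be marked so that it passes no reputation to the destination:

```html
<!-- User-submitted link in a comment, marked nofollow so it passes no reputation -->
<p>Great post! You might also like
  <a href="https://www.example.com/" rel="nofollow">my page on the same topic</a>.
</p>
```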
4. SEO for Mobile Phones
4.1. Notify Google of mobile sites
Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different.
Do:
- Verify that your mobile site is indexed by Google.
- Create a mobile Sitemap and submit it to Google.
- Allow “Googlebot-Mobile” user-agent to access your site.
- Check that your mobile URLs’ DTD declaration is in an appropriate mobile format such as XHTML Mobile or Compact HTML.
Don’t:
- Disallow “Googlebot-Mobile” user-agent from accessing your site.
4.2. Guide mobile users accurately
One of the most common problems for webmasters who run both mobile and desktop versions of a site is that the mobile version of the site appears for users on a desktop computer, or that the desktop version of the site appears when someone accesses it on a mobile device.
Do:
- Redirect mobile users to the correct version or switch content based on user-agent.
- Make sure that the content on the corresponding mobile/desktop URL matches as closely as possible.
- Serve the same content to Googlebot as a typical desktop user would see, and the same content to Googlebot-Mobile as you would to the browser on a typical mobile device.
Don’t:
- Serve different content to Googlebot from what a typical desktop user would see, and different content to Googlebot-Mobile from what a typical mobile user would see.
5. Promotions and Analysis
5.1. Promote your website in the right ways
Effective promotion will lead to faster discovery by those who are interested in the same subject.
Do:
- Master making announcements via blogs and being recognized online.
- Make use of social media.
- Reach out to those in your site’s related community.
Don’t:
- Promote each new, small piece of content you create.
- Involve your site in schemes where your content is artificially promoted.
- Spam others with link requests.
- Purchase links from another site with the aim of getting PageRank instead of traffic.
5.2. Make use of free webmaster tools
Google’s Search Console helps webmasters better control how Google interacts with their websites and get useful information from Google about their site. It allows you to:
- See which parts of a site the Googlebot had problems crawling.
- Notify Google of an XML Sitemap file.
- Analyze and generate robots.txt files.
- Remove URLs already crawled by Googlebot.
- Specify your preferred domain.
- Identify issues with title and description meta tags.
- Understand the top searches used to reach a site.
- Get a glimpse at how Googlebot sees pages.
- Remove unwanted sitelinks that Google may use in results.
- Receive notifications of quality guideline violations and request a site reconsideration.
A web analytics program like Google Analytics is a valuable source of insight for traffic analysis. You can use it to:
- Get insight into how users reach and behave on your site.
- Discover the most popular content on your site.
- Measure the impact of optimizations you make to your site.
- … and much more.
Conclusion
There is a lot more to SEO than what is covered in this guide, but if you do only part of what is recommended, you will be well ahead of most of your competition. Good luck!
Guest Author: Miroslav Chodak is a digital marketing expert with more than 20 years of experience. If you would like to know how SEO-ready your site is, learn how to do an SEO audit, or ask Miroslav to do one for you.