Basic SEO Guide: Optimizing a Website for Search Engines

Would you like to increase visits to your website or improve your site’s ranking on Google, but you don’t know anything about SEO (Search Engine Optimization)? In this basic guide you will find a series of practical tips to start optimizing your website for search engines, along with some insights on writing content for your site.

How Does a Search Engine Work?

Below you will not find a complete treatment of how search results pages (SERPs) are built, but simply an overview of the factors that come into play inside a search engine.

We tend to think that a search engine springs into action when a user types a “query”, or search question.

After the user submits the query, the engine retrieves the results, sorts them and then shows them to the user.

This is the simplest and most intuitive explanation of how a search engine produces its results, but in reality the procedures behind retrieving and cataloguing them are more complicated.

In fact, after a user enters their keywords, the engine first moves on to topic deduction, that is, it tries to understand what topic the query is about (remember that each word can have several meanings; on the subject of word meanings we increasingly talk about semantic SEO today: just think of the Italian word “calcio”, which can mean both football, the sport, and calcium, in nutrition...). It then tries to understand the user’s intent, and finally passes the ball to its specialized algorithms (algorithms dedicated to photos, videos, texts, ...).

Once the information has been passed to the specialized algorithms, the search engine (since around 2012) is also able to personalize the results based on user preferences, browsing data, web history and other factors. So, once the engine has carried out:

  • Query
  • Topic deduction
  • Deduction of the user’s search intent
  • Filtering by specialized algorithms
  • Personalization of results

the engine finally shows us the search results. This demonstrates how complicated it is to rank a website at the top of the search engines, and shows that SEO factors vary according to the search topic and the results already present in the engine’s index.

Among the various factors that influence the construction of SERPs (Search Engine Results Pages), between 2010 and 2012 a user-side categorization was introduced which, through search logs, allowed search engines (and Google in particular) to understand users’ interests and place them in certain “categories”.

In essence, Google has managed to profile each individual user, and it keeps refining that profile. For example, if a user often searches for cars and car news, Google will understand that they are passionate about cars and will place them in the category of people interested in the automotive world.

As a result, when Google builds a search results page, it sorts its results based on the user’s profile. So what should we do when we create a website?

Before thinking about aggressive placements or trying to dominate the SERPs we find most attractive, let’s try to understand which “category of users” our site is useful for, and then categorize our site in such a way as to reach our target user.

How do I find out which category Google thinks my site belongs to?

Google, even if you don’t know it, has already categorized your website. To check which category Google has placed your website in, you can use various tools, for example SEOZoom.

Clearly, to be placed in a certain category you need to have content for that category. It is also useful to use AdPlanner, SEOZoom or other tools to understand how the sites we consider our “competitors” are categorized, and to try to be categorized by Google in a similar way.

Other factors affecting ranking?

There are those who, even in 2015, still believe in the famous “200” ranking factors. In reality there are many more, and I am not sure how useful it would be to list them all, so I have decided to list only the most “popular” and most discussed on the web.

Among the factors that affect the ranking of a website, and that are therefore taken into consideration by search engines, are:

  • Domain name
  • Domain age (how many years has it been online)
  • Site structure
  • Loading speed
  • Optimization for mobile devices / Responsive layout
  • On page optimization
  • Internal linking / Link building
  • Author Rank
  • Citations / Mentions
  • Content marketing activities

How to optimize a site for search engines?

Before starting to delve into the world of link building you should have a well-structured and optimized site: the advice I always give to those who ask me “how do I rank a site on search engines” is first of all to work on the site itself, on what are called onsite factors:

  • Site structure (responsive template, plugins, permalinks, categories, contents)
  • Site optimization (loading speed, ease of navigation)
  • Content (learn to do keyword research, learn to organize content)

Even before external factors, in fact, the first thing to take care of is internal organization: a site that loads quickly, is usable and has excellent content always makes the difference on search engines.

10 Tips to Increase Your Visibility on Google

  • Register your site on Google Search Console: Search Console offers several free tools that give you detailed information about the visibility of your pages on Google and beyond;
  • Put yourself in the users’ shoes: browse your site as a reader and make sure it is simple to use and that the pages and content you write contain the words users are really searching for;
  • Use the right headlines: every page and every article should have a good headline that grabs attention and answers a reader’s question. Within the article the text should also be structured with headings (h1, h2, h3);
  • Provide valid meta tags and descriptions: plugins like Yoast SEO let you change the description that appears for your content on Google; a good description gives you a more attractive and interesting “snippet”, improving the CTR of your page on the search engines (see the HTML sketch after this list);
  • Beware of permalinks: Google itself has always said that URLs should be readable. Set simple, meaningful URLs; a practical example is the URL of this article: www.monetiando.com/guida-seo-base/
  • Structure internal links correctly: it was Google again, back in 2011, that advised website and blog owners to make sure that particularly important pages are linked from another page of the site already indexed by Google;
  • Don’t overuse JavaScript and Flash: set your site’s core content and navigation elements as text, using JavaScript and Flash only to enhance or highlight them. Google prefers sites that are light and quick to load, not filled with banners, Flash, JavaScript, pop-ups and pop-unders!
  • Use ALT attributes for images: it is easier for Google to identify the content of an image if it has an ALT attribute (also useful if the image fails to load for any reason: instead of seeing an error, the user can read the image’s alt text);
  • Responsive site: check how your site loads on various mobile devices, make it responsive and, if possible, Retina-ready;
  • Give your users the best: Google explains in its guidelines that the main purpose of a site is to offer the best possible experience to its users; measure the changes you make to the site with Google Analytics and with its site optimization tools.
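
To make several of the tips above concrete, here is a minimal, hypothetical HTML skeleton (page names, URLs and texts are invented for illustration) showing a descriptive title and meta description, a responsive viewport, one h1 followed by h2 subheadings, plain-text navigation and an image with an ALT attribute:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- Responsive: let mobile browsers scale the page correctly -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- The title and meta description drive the snippet shown in the SERP -->
  <title>Basic SEO Guide: Optimizing a Website for Search Engines</title>
  <meta name="description" content="Practical tips and advice to start optimizing your website for search engines.">
</head>
<body>
  <!-- Core navigation as plain text links, not JavaScript or Flash -->
  <nav>
    <a href="/guida-seo-base/">SEO Guide</a>
    <a href="/blog/">Blog</a>
  </nav>
  <!-- One h1 per page, then h2/h3 headings to structure the text -->
  <h1>Basic SEO Guide</h1>
  <h2>How Does a Search Engine Work?</h2>
  <p>...</p>
  <!-- The ALT text describes the image if it cannot be displayed -->
  <img src="/images/serp-example.png" alt="Example of a Google search results page">
</body>
</html>
```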

The domain and structure of the website

The first step should definitely be the choice of the domain, because we must think not only about search engines but also about users: if the domain is not easy to remember (or simply doesn’t stick in your head), we are probably already starting off on the wrong foot.

Once you have chosen the domain (and remember that some people buy expired domains), it is very important to choose a good hosting provider, a good CMS (today we know the most used is WordPress) and a good WordPress template: a theme with structured data, clean code and fast loading times that is not too cluttered and, by the way, must be responsive or mobile friendly/first. Also avoid installing too many plugins (add-ons), and keep an eye on the loading speed of your website.

Once all this has been set up, you can work on the structure of the site (the classic advice is to create a tree structure, not to go too deep with subcategories, and to use tags intelligently, not as if there were no tomorrow!). Within the site structure you should also look after the main pages (optimizing titles and the various h1, h2, h3 headings, optimizing descriptions, internal linking between the main pages, and so on). URLs must also be “SEO friendly” (who knows, maybe you forgot to set your permalinks).

Content is king?

With the experience gained over the years, I can’t tell you whether “content is king” is really true, but I can tell you that quality content is, over time, increasingly rewarded by search engines (and, above all, by users). I’m not here to discuss what “quality” is, since it is generally considered rather subjective, but the quick advice I want to give you is to write something useful for users, in a language anyone can understand, otherwise you will not be read.

Remember to use bold and bullet points where necessary or useful, to space the text well, to create a good editorial plan and to take advantage of internal linking, as shown in the sketch below.
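
As a purely illustrative sketch (the page and anchor text are invented), this is how bold text, a bullet list and an internal link with descriptive anchor text can be marked up:

```html
<p>Before publishing, check the <strong>three pillars of on-page writing</strong>:</p>
<ul>
  <li>A clear, useful answer to the reader's question</li>
  <li>Well-spaced paragraphs with headings and bold where it helps</li>
  <li>Internal links with descriptive anchor text, for example
      <a href="/guida-seo-base/">our basic SEO guide</a></li>
</ul>
```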

It is important to create both “topical” content (which can bring traffic in the short term) and evergreen content that can guarantee stable traffic over time because it stays useful. By the way, don’t write about every possible topic: remember to pick your niche (and perhaps expand it slowly over time).

Once you have created the content, do not expect a miracle: promote your content on social networks (without spamming) and in groups, communities and forums. Remember to give away some of your best content (i.e. write guest posts, but only for sites of very high quality and reputation) to prove what you are worth, and do not skimp on tips and suggestions.

Once you have created good content, try to contact people who cover the same topics and suggest the content you have created: inbound links are important (i.e. it is important that people talk about your content and your site, whether by adding a link, commenting on it or quoting you, especially when this comes from other experts).

Usability: get your articles read (and your site visited)

You can write the most beautiful articles in the world, but if the font is 8 pixels, rest assured that… nobody will read them.

Work on the usability of your website: try searching for “your best articles” yourself and browse your site as a user would. Can you find the information easily? Is it hard to find your most popular articles? Work on usability, text spacing, font size, and so on. In short, make your site attractive, and never forget to experiment and analyze the results.

Why is nobody sharing my content?

You have written a fantastic article and are waiting for people to share it. But did you share it yourself? I know plenty of people who hope for shares… without ever sharing the best content they have published.

If you don’t do it, why should anyone else? Start by setting a good example (by the way, you can also share good content from your competitors, there’s nothing wrong with that, is there?).

What about link building?

In my opinion, link building is a real art. It is not for everyone, not everyone knows how to do it, and above all it is not an exact science. If you don’t know how to do it, don’t experiment on your clients: create your own “pilot” projects on which to test and refine your techniques. Avoid spam (link building does not mean spamming, it means building a solid presence on the net), learn to create solid relationships and remember to compare notes (politely) with those who know more than you.

This is just the beginning, my dear Semola: if you are intrigued by the world of SEO you can read various manuals, but nothing will train you more than experimentation and discussion with serious professionals.

How to Avoid Broken Links on a WordPress Site?

One of the problems of many websites is certainly the presence, within the site, of “broken” or non-working links.

During the life of a blog or a website it is natural (and right) to link to other websites, but it happens that, after a few years, some domains are no longer renewed or some pages are deleted, and this increases the number of “broken links” within our sites.

A broken link is not only a “negative experience” (if we can call it that) for a user who visits our site (and who, by clicking, expects to find something useful rather than a page that no longer exists): it can give the impression that the website is not really well looked after and, in any case, Google does not like it.

If a website has a lot of links pointing to broken or wrong addresses, this could become a problem in the long run; fortunately, it is easy to fix.

With WordPress, in fact, you just need to install a very simple and functional plugin: Broken Link Checker. Once it is installed and activated on our site (which, of course, must run on WordPress), a “Link Checker” item appears under “Settings”: it will automatically and periodically check for broken links and suggest what to do with them (delete the link, update it, ...).

Google TrustRank: What is it?

Premise: why TrustRank was born.

TrustRank was created to combat web pages considered spam. It is an algorithm that evaluates the links pointing to web pages, so that pages considered spam can be kept out of the results shown to users.

Websites are thus divided into high-trust (credible) sites, low-trust sites and spam sites.

Initially, TrustRank was a trademark registered by Google and used to combat online scams and phishing. Yahoo! later adopted the term to describe techniques for fighting web spam.

What is TrustRank? TrustRank is an analysis of the links and contents of a website that helps determine which websites are authoritative and which are spam. If our site is linked to by sites with strong trust (by definition important national sites, such as the websites of municipalities, regions, or newspapers like Repubblica and Corriere in Italy), our site is most likely to be considered high quality; vice versa, if the sites that link to us have low trust, our site will not have much credibility.

In summary, TrustRank classifies pages and websites and gives them more or less authority.

The trust of a website, moreover, gives credibility and relevance not only to a specific page or to the domain in general, but to all the articles and content on the site. It should also be noted that in 2008 Google gave up the “TrustRank” registered trademark.

In an old video from 2011 you can hear Matt Cutts explaining what Google means by “Trust”:

Google’s algorithms start from web pages that are not classified as spam (called “seeds”); human experts analyze these pages, assigning scores and indicating to the algorithms which pages belong in the spam category and which are of high quality; the algorithm then classifies the other pages accordingly.
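
As a rough sketch of the idea behind the original TrustRank research paper (Gyöngyi, Garcia-Molina and Pedersen, 2004), trust is propagated from the hand-reviewed seed pages through the link graph, much like personalized PageRank:

t* = α · T · t* + (1 − α) · d

where T is the normalized link (transition) matrix of the web graph, d is the seed vector (non-zero only on the pages judged trustworthy by the human reviewers) and α is a damping factor (0.85 in the paper). In plain terms, pages reachable in a few links from trusted seeds inherit high trust, while pages linked mostly from untrusted corners of the web end up with low scores. Note that this describes the published research, not necessarily what Google runs internally.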

How to acquire TrustRank, or how to acquire authority

Hubs are the sites that, through their links, connect the analyzed site to the rest of the web and pass authority along to it. Sites are considered authoritative if:

  • They are very up to date;
  • They have a lot of quality inbound links;
  • They have incoming links not only to the home page but also to other pages;
  • They have few outbound links;
  • Their outbound links point to specific, non-generic resources.

What are the factors that increase the Trust Rank?

  • The presence of an SSL certificate;
  • The “Hacker Safe” Logo;
  • The domain name registered for a long period (10 years);
  • Contact details and addresses of the site owner that are easy to find;
  • A privacy policy;
  • Receiving authoritative links.

SEO books

If you want to deepen your SEO knowledge, you will probably want to know which are the best SEO books on the market.

First of all, I can’t help but mention my own book dedicated to SEO.

Certainly one of the books that made history, perhaps because it was one of the first complete books published on the subject, is SEO Power by Giorgio Taverniti, the founder of the historic Forum GT.

Other noteworthy books are SEO and SEM by Marco Maltraversi, the famous The Art of SEO by Eric Enge and Stephan Spencer, and Link Building by Ivano Di Biasi.

“Not Provided”: The Keywords Hidden from Site Owners

It is very easy for a user to “hide” their searches from website owners: the “not provided” results are all the searches made by users who are logged in to Google (with Gmail, YouTube, Google+, ...). Simply being logged in is enough to hide the keywords used in the search from site owners.

Today, many websites, when they look at their statistics, find that “not provided” accounts for about 60–80% of their total search traffic.

What does the loss of this data really mean?

In short, the loss of this data means:

  • Changing the way we measure the SEO performance of websites and of optimization work;
  • That organic traffic from Google can no longer be tracked at the keyword level via Google Analytics;
  • That we can nevertheless extract a limited number of keywords through Google Webmaster Tools;
  • That we no longer have the ability to see traffic data for “long tail”, “keyword group” and “brand” terms;
  • A loss of visibility on the new keywords emerging on our site, around which to build new opportunities and content;
  • That we need to use different metrics to understand SEO performance;
  • That we need to increase the number of terms whose rankings we track, and evaluate them against the URLs that perform best within our site.

How to measure the success and performance of SEO operations?

SEO professionals use a combination of rankings, traffic and conversions as metrics and KPIs to measure the results of the work in progress. The metrics still available today are:

  • An overview of the traffic that comes from search engines;
  • The total number of conversions from organic traffic, overall or by URL;
  • Critical search terms;
  • Some of the keywords that are driving traffic to our site.

Today it is important to measure which pages of the site bring in (receive) the most traffic from organic search, and to try to understand which keywords are related to these pages.

Use Google Webmaster Tools to understand the click/impression variations for your keywords, bearing in mind that this data is not 100% accurate and is available only for a limited share of keywords.

Redirect 301 and Canonical

In 2009 Matt Cutts suggested using rel=canonical when a 301 redirect could not be used.

But, one user writes, a 301 hurts performance because it forces the browser to make another “trip” to my server. Isn’t it possible to use rel=canonical everywhere instead of a 301 redirect?

Matt Cutts replies quite diplomatically: you are the owner of your website, so you decide when to use a 301 redirect and when to use rel=canonical. But most of the time I (Matt Cutts) recommend using the 301 redirect.

This is because everyone knows how to issue and interpret it: users, browsers and search engines all understand it. Furthermore, the 301 redirect is normally used because you are migrating part of your site to a new location.
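
As a minimal illustration (the URLs are invented), this is what the canonical annotation looks like in a page’s head; a 301, by contrast, is not markup at all but an HTTP status code returned by the server before the page even loads:

```html
<!-- On https://www.example.com/guida-seo-base/?utm_source=newsletter -->
<head>
  <!-- Tell search engines which URL is the preferred, canonical version of this content -->
  <link rel="canonical" href="https://www.example.com/guida-seo-base/">
</head>
```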

How to Quickly Index Content on Google?

Is it possible to force the indexing of a website or a page, content, article, on Google?

The time Google takes to index a web page varies based on numerous factors, but it is possible to ask Google to “come to our site” and look at a piece of content.

This is a system that must be used in moderation: it allows you to request an immediate visit from Googlebot to our site, and it is a legitimate practice (it is not black hat SEO) that gets a resource analyzed right away.

In practice, just open Google Search Console and go to “Crawl / Fetch as Google”.

From this page we can enter the URL of the page to be indexed, choose between desktop and mobile, then click on “Fetch” or “Fetch and Render”: Google will retrieve the content of the page, and we can then click the “Submit to Index” button to make our indexing/re-indexing request.

Negative SEO: How to Report Unnatural Links to Google?

They called it the “disavow links tool”

First of all, you can of course check your recent links and review all the links pointing to your site. Generally (this feature has been around for some time) you receive a message from Google that says, more or less: “Unnatural, malicious or low quality links pointing to your site have been detected”, and you are advised to remove them.

Generally, what you needed to do was try to contact the sites linking to your site and ask them to remove those links.

Matt Cutts points out that if you have not adopted aggressive SEO strategies, it is not a problem to remove a few low quality links.

The Disavow Links Tool, or the tool for disowning links, as also explained in the video, must not be used massively: apparently it should only be used in particular situations or extreme cases.
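
For reference, the disavow tool accepts a plain text file. As a purely illustrative sketch (the domains and URLs are invented), each line lists either a single URL or an entire domain to disavow, and lines starting with # are comments:

```
# Sites contacted with no reply
domain:spammy-directory.example
domain:paid-links.example

# Individual pages linking to us unnaturally
http://blog.example.net/cheap-links-page.html
```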

It is also clear that it would have been more convenient for SEOs and website owners to have a simple check mark next to the list of URLs linking to a site, to tell Google “consider this valid / disavow this one”.

It should therefore be an important weapon, not only against those who link to us without our knowledge, but also against those who attempt negative SEO (i.e. carry out black hat operations against competitors’ sites in order to push their own site up).

The Difference Between Black Hat SEO and White Hat SEO

Below you can see some examples of black hat SEO practices and white hat SEO practices: this is not a complete list, just a few examples that will help you understand the difference between those who work in a “clean”, white-hat way and those who instead try to rank on search engines at all costs.

Black Hat SEO Strategies

  • Duplicate Content
  • Invisible Text and Keyword Stuffing
  • Cloaking or Redirection of users to other sites or pages
  • Links from sites with non-relevant content

White Hat SEO Strategies

  • Creation of relevant value content
  • Well-labeled images
  • Links and References from Relevant Sites
  • Complete, simple, grammatically correct sentences
  • HTML Compliant
  • Unique and Relevant Page Titles

Google Algorithms from 2003 to 2012

The evolution of Google and its search engine.

The algorithm updates from 2003 to 2007 mainly targeted sites that excessively optimized their pages (on-page over-optimization).

– 2003 –

Google Dance: (January 2003) Monthly updates aimed at improving search results (Boston, Cassandra, Esmeralda, Fritz), which caused websites to change position frequently in the SERPs.

Florida: (November 2003) Sites using keyword stuffing and on-page over-optimization techniques are identified and hit.

– 2004 –

Austin: (January 2004) The algorithms that recognize on-page black hat techniques (hidden text and keyword stuffing inside meta tags) are improved;

Brandy: (February 2004) Updates aimed at expanding the index, working on latent semantic indexing (LSI) and paying more attention to the relevance of a link’s anchor text. Google improves its understanding of synonyms and keyword analysis.

– 2005 –

Nofollow: (January 2005) Google, Yahoo! and Microsoft introduce the “nofollow” attribute with the aim of controlling the quality of inbound links. The attribute tells search engines not to follow certain links, such as spam links in blog comments.
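
As a small illustration (the URL is invented), this is what a nofollowed link looks like, for example in a blog comment:

```html
<!-- rel="nofollow" asks search engines not to count this link as an endorsement -->
<a href="http://commenter-site.example" rel="nofollow">commenter's site</a>
```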

Allegra: (February 2005) There is talk of the Sandbox and of LSI; several webmasters notice changes in the SERPs, but the goal of this update is not well understood. Perhaps Google is starting to penalize suspicious links.

Bourbon: (May 2005) Bourbon is believed to change how duplicate content within a website is evaluated.

Local Maps : (October 2005) Big algorithmic update that affects the ranking factors for Local SEO.

Jagger: (October 2005) With a series of updates, Google hits low quality links, reciprocal links, link schemes and link selling.

– 2006 – Nothing reported: judging from the infographic, there does not seem to have been any algorithmic update at Google in 2006 (very strange!)

– 2007 –

Universal Search (May 2007): This is not a real algorithmic update, but from this point Google inserts news, images, local results and other vertical engines into the SERP. It is said that from this moment the old SERP with ten plain results ceases to exist.

– 2008 –

Suggest (August 2008) Google starts showing search suggestions in a box while users are typing a keyword; over the years it will evolve into Google Instant.

The updates carried out between 2009 and 2010 seem to give great importance to the ranking signals linked to a brand.

– 2009 –

Vince (February 2009) Big update that seems to favor big brands.

Rel Canonical Tag (February 2009) – Google, Microsoft and Yahoo! announce support for the Canonical tag, through which webmasters can send canonicalization signals to search bots without affecting visitors (rel = “canonical”).

Caffeine (August 2009) Google releases a major infrastructure change designed to accelerate crawling and expand the index by integrating indexing and ranking in near real time.

Real Time Search (December 2009) – Real-time content from Google News, Twitter and freshly indexed pages starts to appear alongside other sources within real-time feeds (including social media, depending on the topic).

– 2010 –

May Day (May 2010) – Webmasters notice a decline in long-tail traffic between April and May. Matt Cutts confirms that this algorithm affects long-tail queries (sites with large amounts of thin, low quality articles are hit).

Brand Update (August 2010) – Not a real algorithmic update: Google starts showing multiple results from the same domain in the SERP.

Google Instant (September 2010) – Search results are shown as you type a query. The impact on the SERPs is actually minor; the update simply helps deliver results to users faster.

Social Signal (December 2010) – Google and Bing use social signals as ranking factors.

Between 2011 and 2012, social signals increase in importance within Google’s ranking algorithms.

– 2011 –

Attribution (January 2011) – To combat the rise in spam, Google launches an update that helps solve content attribution problems. A forerunner of Panda…

Panda (February 2011) – Sites with low quality content, content farms, sites with a lot of advertising relative to content and sites with content quality problems are hit (the update reaches Europe in April 2011).

+1 Button (March 2011) – Google launches the +1 button. This button affects search results within social circles (Also for paid ads).

Schema (June 2011) – Google, Yahoo! and Microsoft announce joint support for structured data, creating schema.org to enrich search results.
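
As a purely illustrative sketch (titles, dates and names are invented), schema.org structured data can be embedded directly in the HTML, for example with microdata attributes:

```html
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Basic SEO Guide</h1>
  <!-- A machine-readable publication date alongside the visible one -->
  <time itemprop="datePublished" datetime="2015-06-01">June 1, 2015</time>
  by <span itemprop="author">Example Author</span>
</article>
```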

Google+ (June 2011) – Google launches Google+; the innovations introduced are user circles and integration with Google products (at first only Gmail, later expanding continuously).

Pagination elements (September 2011) – To prevent crawling and duplicate content problems on paginated archives, Google introduces the rel="prev" and rel="next" attributes.
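
As a minimal sketch (the URLs are invented), these link elements would go in the head of page 2 of a paginated archive to declare its neighbours:

```html
<!-- On /blog/page/2/: declare the previous and next pages in the series -->
<link rel="prev" href="https://www.example.com/blog/">
<link rel="next" href="https://www.example.com/blog/page/3/">
```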

Freshness (November 2011) – Google announces an algorithmic update on content freshness (impacting about 35% of queries).

– 2012 –

Search + Your World (January 2012) – Google announces changes to how Google+ social content is handled: it is now integrated with user profiles and appears in the SERPs (the personalization can be disabled with a button).

Ads Above the Fold (January 2012) – Google updates its page layout algorithms and devalues sites with too much advertising at the top of the page (above the fold).

Venice (February 2012) – Organic results are improved by localizing them and integrating them with local search data.

Penguin (April 2012) – Over-optimization penalty and webspam update: the algorithm targets spam factors, including keyword stuffing.

Knowledge Graph (May 2012) – Additional data integrated into the SERPs to provide information about people, films, places, music, and more.

DMCA Penalty / Pirate Update: (August 2012) Sites with repeated copyright infringements are penalized.

Exact Match Domain (September 2012) – Google changes the way it evaluates exact-match domains relative to the topic of the site.

Do you need advice? Contact me.

This article will be updated constantly; to stay up to date you can add it to your favorites or subscribe to the Monetiando.com® newsletter.

Good luck with your work and your rankings on the search engines,

