Quick SEO Tip – block your backlinks' visibility


You spend hours and hours on research, find the best sites, place your links there or build buffer sites… and your competitors simply copy your best activities? Don't let them see what is going on – block their analytical tools and let them work on their success alone!

How can we do this?

Every self-respecting SEO professional uses web analytics tools. For link building, the popular ones include, among others, Ahrefs, Majestic and Moz.

We use these tools to determine which places are worth our online attention. I am talking both about building referral links (backlinks) to our website and about links that go out from our site and point to external websites (yes, those are really important too! Used properly, they have a huge impact on TrustRank).

These tools use robots/crawlers to scan almost everything on the Internet (just like Google, right?), and we can simply block them. This leads to a situation in which our competitors using tools like Ahrefs, Moz or Majestic won't see our link-building activities. Sounds perfect, right? How to do it? We can use our .htaccess file (instructions on how to proceed can be found here: A few optimizations in .htaccess).
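A minimal sketch of such a block, assuming mod_rewrite is enabled on your server. The User-Agent names below are the well-known crawlers of Ahrefs, Majestic/Moz partners and Semrush; extend the list with any other robots you want to keep out:

```apache
# Deny access to common backlink-analysis crawlers by User-Agent.
# [F] returns 403 Forbidden, [NC] makes the match case-insensitive.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|MJ12bot|DotBot|SemrushBot) [NC]
RewriteRule .* - [F,L]
```

Note that polite crawlers can also be turned away in robots.txt, but .htaccess enforces the block even for robots that ignore robots.txt.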

A few examples of robots that are worth blocking can be found in this file: Web robots.

Enjoy and have a good day!

Keep optimizin’!

Meta tags rock!


Meta tags are not recognized by Google as a direct SEO factor nowadays… but they are still very important when interacting with search engine users. So indirectly, by increasing CTR (Click Through Rate), they matter to SEO. Properly written metas can boost traffic on our website and result in higher rankings.

What are meta tags and where can we find them?

First, let's have a look at what Google wants to tell us about them:

Search Console Help

In a few words – the most important and simplest ones are TITLE, DESCRIPTION and KEYWORDS. The first two appear directly on the SERP, so when we search for something, we always see them. They help us decide which website to choose – which one will be the most valuable for us. The third one can be added optionally for Google's robots.
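To make it concrete, here is what those three tags look like in a page's HTML head. The shop name and texts are made up purely for illustration:

```html
<head>
  <!-- TITLE – shown as the clickable headline on the SERP -->
  <title>Blue Widgets – Handmade &amp; Free Shipping | ExampleShop</title>
  <!-- DESCRIPTION – the snippet text under the headline -->
  <meta name="description" content="Shop handmade blue widgets with free shipping. Order today and get yours in 48 hours!">
  <!-- KEYWORDS – optional, ignored by Google for ranking -->
  <meta name="keywords" content="blue widgets, handmade widgets">
</head>
```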

Where can we find meta tags - google serp example

We do not write meta tags for Google bots nowadays, but rather use them to attract potential customers. But how can we determine whether one set of meta tags (title, description) is better than another?

There's no easy answer – as is probably always the case in SEO. We need to start thinking like our customers, feel their needs and answer their questions right there. A little tip: google the phrase or keywords you want to rank for… to see what your competitors from the TOP 10 have used there. Then try to use something that addresses your customers' needs more completely.

Is Google always displaying the meta tags that we have written in our HTML code?

The answer is no. Google has our site indexed and in most cases shows a snippet prepared from the content used on our website. So how can we tell Google to display our meta tags instead of the automatically generated ones?

We just have to remember to:
– make them attractive for our customers (TEST, TEST and TEST again),
– prepare meta tags that fully reflect the content presented on our website,
– keep descriptions between 135 and 150 characters,
– use some 'call to action' phrases.

A few optimizations in .htaccess

.htaccess - Not too sexy, but necessary

Why it's worth reading further and keeping optimization going:

  • You can do it on your own – practically no technical skills required (but remember to keep a copy of your .htaccess file in case something goes wrong)
  • Your websites will simply rank higher (and that's the reason why you're here, am I right?)


A few examples of .htaccess usage:

Redirecting site versions (www, without www, index.html)

You've probably heard about 'duplicate content' and that it can destroy all of your SEO work. Having site versions with 'www' and without it, and not redirecting them properly, is recognized as 'duplicate content'.

If you weren't responsible for this part of web development (and don't really know what is going on here), type domain.com, www.domain.com and www.domain.com/index.html into your browser (or another index file type, like index.php) and see what happens. If you always end up on a single site version, for example without the 'www' prefix, that's good (by the way, redirecting all site versions to the non-'www' version is the better idea from an SEO perspective).

If not, we have some work to do here.

First, how to remove index.html:
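A sketch of a typical mod_rewrite approach (assuming mod_rewrite is available), which 301-redirects any request ending in index.html to the clean directory URL:

```apache
RewriteEngine On
# Match the literal "index.html" in the browser's request line
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s(.*/)?index\.html [NC]
# Redirect /index.html or /some/dir/index.html to / or /some/dir/
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,NC,L]
```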

If you have index.php, just change index\.html to index\.php.

And now – redirecting from the www to the non-www version:
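A minimal sketch – domain.com is a placeholder for your own domain, and the target scheme (http vs https) should match your site:

```apache
RewriteEngine On
# Catch requests arriving at the www host…
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
# …and 301-redirect them to the bare domain, preserving the path
RewriteRule ^(.*)$ https://domain.com/$1 [R=301,L]
```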

301 redirections

301 redirects are commonly used to 'tell the search engine' that we have permanently moved our site to a new location. A 301 redirect passes almost 100% of the link juice from the old address to the new one, and in most instances it's the best method for implementing redirects on a website.

When we want to redirect traffic from /category/television/ to /category/tv/, we can put code like this in our .htaccess:
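One simple way to do it is mod_alias's Redirect directive, which takes the status code, the old path and the new location:

```apache
# Permanently (301) redirect the old category path to the new one
Redirect 301 /category/television/ /category/tv/
```

Note that Redirect matches by prefix, so /category/television/sony would also be forwarded to /category/tv/sony.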

Caching

Caching is very important to SEO nowadays. Page speed is one of the main factors used by Google to determine a site's position. When our website is already prepared and we don't plan dramatic changes to its look in the near future, we can tell browsers to keep some site elements stored longer in the user's cache. This way, when someone visits the site a second time, it loads much faster!

Here's a nice working piece of code for you. If you use WordPress, just paste it into your .htaccess before the '# BEGIN WordPress' block. If you use a different platform, or build your site without any CMS, take the elements that suit your needs (e.g. images).
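A sketch using mod_expires (the IfModule guard keeps the site working even if the module is disabled); the lifetimes below are sensible starting points, not fixed rules:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change – cache them for a month
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"
  # Stylesheets and scripts – a week is a safer default
  ExpiresByType text/css               "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```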

We can check whether it works and speeds our site up using Google PageSpeed Insights. Let me know how your score changes after a few minutes!

Excluding bot traffic – banning certain IP addresses

Sometimes the server is overloaded with tons of bot requests. When we see that our site is loading longer (for example in Google Search Console, in the mentioned PageSpeed Insights, or when analyzing stats directly on the server side), it can be because of a large amount of bot traffic.

Using .htaccess we can exclude traffic from the specific IP addresses responsible for the bot traffic:
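A sketch in Apache 2.4 syntax (on Apache 2.2 the older Order/Allow/Deny directives are used instead); the addresses below are reserved documentation IPs – replace them with the offenders from your logs:

```apache
<RequireAll>
  # Allow everyone…
  Require all granted
  # …except these example addresses (single IP and a whole /24 range)
  Require not ip 192.0.2.15
  Require not ip 198.51.100.0/24
</RequireAll>
```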

One last thing – very important: protect your .htaccess.
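Apache usually refuses to serve dotfiles by default, but it costs nothing to deny HTTP access to the file explicitly (Apache 2.4 syntax assumed):

```apache
# Block direct HTTP requests for the .htaccess file itself
<Files ".htaccess">
  Require all denied
</Files>
```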

And how do we create the .htaccess file in the first place? You can do it easily using the standard Notepad available in Windows. Just choose the 'Save as' option, set the file format to 'All files' (not .txt) and name the file '.htaccess'. That's all!

That's all for my first post. I would be very grateful if I have helped anyone! If you have any questions, want to share opinions, or want to discuss this or other topics – feel free to do so.

Have fun and just keep growing online!