Cookie Into the (open) source


WordPress: migrate from one sub-directory to the root

Today's article is a bit technical: I had to dig into some details I had treated too lightly when I installed WordPress (the engine of this blog).

Below are the steps from the WordPress support blog to move an existing WordPress installation from inside a folder to the domain root. In my case WordPress was installed in "", while the index.php file in the root directory was a simple 301 redirect to the "/wordpress" directory.


The keyword “(not provided)” in Google Analytics

Google Analytics is a powerful tool that Google makes available to us free of charge. It offers a myriad of possibilities: monitoring visits, tracking how often a visitor returns to the site, the most and least clicked sections, and even the keywords visitors searched for before reaching the site.

Since Google enabled SSL for logged-in users, however, the entry "(not provided)" has appeared in my statistics. At the beginning it was a very low percentage, to which I gave due weight; after several years, however, I find the "(not provided)" entry at 64.52%, now third among all my search keywords.


Some tips on the use of “.htaccess”

not_found.jpg Having arrived at the third consecutive post on ".htaccess", I think it's fair to say the topic intrigues me quite a bit. Today we will see some tips and customizations in the use of ".htaccess" that improve the performance of our server.

Filed under: SEO, Tips, Web Continue reading

Use the file “.htaccess” to block hotlinking

broken_link.jpg We continue talking about ".htaccess": today we see how to block hotlinking of images that reside on our server.

Yesterday we talked about how to prohibit access to our images from external sites (as often happens with forums and amateur sites), which do nothing but increase bandwidth usage, draining the resources of our server. With yesterday's rule, the user sees the missing-image symbol, while anyone trying to connect directly to the image gets the error "403 Forbidden".

Today we want to have some fun instead. Wouldn't it be funny if, instead of seeing a missing image, the user saw another image urging them not to link from our server? This time we will create the ".htaccess" file directly in the images directory. It is possible to create more than one ".htaccess" on a server; if you wish, you can create one for each directory. Let's see what ours for the images will look like:

Filed under: SEO, Tips, Web Continue reading

Use the file “.htaccess” to block access

googlebot1.jpg Anyone who has ever monitored the traffic of a site will have come across accesses very similar to one another: often they are crawlers (Google itself uses a sophisticated crawler called GoogleBot). Paraphrasing Wikipedia:

A crawler (also called a spider or robot) is software that analyzes the contents of a network (or of a database) in a methodical, automated way, typically on behalf of a search engine.

Crawlers are usually harmless: they generate little traffic on the site (when well written) and in exchange offer the indexing service we all know and appreciate.

However, there are crawlers that use the same mechanisms as indexing crawlers to scan the web looking for flaws in page code. As we know, webmasters are not always careful in their planning, and sometimes are not even aware of some security holes in their site (or portal). These harmful crawlers explore the web far and wide, indexing pages for themselves in order to "pierce" a site and gain access to the server or, worse, to sensitive data.

Moreover, besides providing no useful indexing service, they increase bandwidth usage, slowing down the site for real visitors. Adding insult to injury, in short.

Fortunately, it is possible, via a simple text file, to block access from certain IPs or "user agents" once they have been identified. I'm talking about the ".htaccess" file.

The ".htaccess" file is a configuration file for the server: a very simple tool, but equally powerful, and it cannot be used lightly. An error in the configuration file may even lock the webmaster out of their own pages, so let's proceed with caution.

The safest way to know whether the "user agent" behind a recent visit to the site is a harmful crawler is a Google search. Search separately for the "user agent" and for the IP address from which we received the requests.
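To pull those two pieces of information out of the server logs, a couple of shell one-liners are enough. A sketch, assuming Apache's "combined" log format; the sample log lines below are made up purely for the demo:

```shell
# Create a tiny sample access log (in practice you would read your real one)
cat > access.log <<'EOF'
198.51.100.7 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "BadBot/1.0"
198.51.100.7 - - [01/Jan/2024:00:00:02 +0000] "GET /wp/ HTTP/1.1" 200 128 "-" "BadBot/1.0"
203.0.113.9 - - [01/Jan/2024:00:00:03 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Requests per user agent (the last quoted field of each combined-format line)
awk -F'"' '{count[$(NF-1)]++} END {for (ua in count) print count[ua], ua}' access.log | sort -rn

# Requests per IP (the first field of each line)
awk '{print $1}' access.log | sort | uniq -c | sort -rn
```

Any agent or address with a suspiciously high count is a good candidate for the Google search above.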

Blocking a bot through ".htaccess"

This example, and all the following ones, can be added at the bottom of the ".htaccess" file, if it has already been created. If it does not exist yet, you can create it: a simple text file named ".htaccess", placed in the server's root directory.

RewriteEngine on
# Let's get rid of the bot
RewriteCond %{HTTP_USER_AGENT} ^BadBot
RewriteRule ^(.*)$ http://go.away/

What does this piece of code do? Simple. The lines above tell the server to check every access whose "user agent" begins with "BadBot". When it finds a match, it redirects the request to a non-existent address, "http://go.away/".
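Instead of redirecting to a dead address, the same rule can simply answer "403 Forbidden" directly, using mod_rewrite's `[F]` flag. A minimal sketch, reusing the same hypothetical "BadBot" agent:

```apache
RewriteEngine on
# Deny BadBot outright with a 403 instead of a redirect
RewriteCond %{HTTP_USER_AGENT} ^BadBot
RewriteRule .* - [F]
```

The `-` means "leave the URL unchanged"; the `[F]` flag makes the server respond with 403 without any redirect round-trip.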

Now let's see how to block more than one:

RewriteEngine on
# Let's get rid of the bots
RewriteCond %{HTTP_USER_AGENT} ^BadBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [OR]
RewriteCond %{HTTP_USER_AGENT} ^FakeUser
RewriteRule ^(.*)$ http://go.away/

The code above does exactly the same as the first example, but in this case it blocks all "user agents" that begin with "BadBot", "EvilScraper" or "FakeUser". Note that when there is more than one bot to block you need to put "[OR]" at the end of every rule line except the last.

Blocking bandwidth "thieves"

Casual visitors usually don't know it, but it often happens that, to lighten the load on their own server (or out of simple ignorance), some webmasters embed images hosted elsewhere in their pages. This lightens the load on their server, which is spared the burden of hosting the image, but weighs on the traffic of the server where the image actually resides; not to mention that the second server gets nothing in return for the work done.

Since we cannot keep renaming the images on our website one by one, here too ".htaccess" comes to the rescue.

RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*somebadforum\.com [NC]
RewriteRule .* - [F]

In this way every image request coming from the blocked site will receive a "403 Forbidden" code. The end result will be the classic missing-image symbol (broken image), and our server's bandwidth will be spared.
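A variant, instead of a bare 403, redirects hotlinked requests to a replacement image that invites the other site not to link from our server. A sketch: the forum domain is the same example as above, and the path "/images/no-hotlink.png" is a placeholder for whatever warning image you prepare:

```apache
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*somebadforum\.com [NC]
# Exclude the replacement image itself, or the rule would loop on it
RewriteCond %{REQUEST_URI} !/no-hotlink\.png$
# Send hotlinkers the warning image instead of the requested one
RewriteRule \.(gif|jpe?g|png)$ /images/no-hotlink.png [R,L]
```

The extra `RewriteCond` on `%{REQUEST_URI}` is important: without it, the redirect to the replacement image would itself match the rule and loop forever.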

To block more than one site, this is the code:

RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*somebadforum\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*example1\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*example2\.com [NC]
RewriteRule .* - [F]

As above, note the "[OR]" at the end of each line, except for the last.

Banning by IP

It can also happen that a bot gets clever and rotates its "user agent" in order to keep accessing the pages of the site. When this happens, one way to block such an "imaginative" bot is to block its IP (only do so if the accesses keep coming from the same IP). Again, in our trusty ".htaccess" we add the following lines:

order allow,deny
# the three addresses below are only examples
deny from 192.0.2.10
deny from 198.51.100.23
deny from 203.0.113.5
allow from all

In this example we block three IP addresses, and with the last line we grant access to everyone else. It is also possible to block an entire address prefix (e.g. 192.168.*):

order allow,deny
deny from 192.168.
deny from 100.29.
deny from 203.0.113.
allow from all

With these rules, all IP addresses that begin with "192.168." (and likewise for the other prefixes) will be blocked.
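For completeness: on Apache 2.4 the `order`/`deny` directives above are deprecated in favour of the newer `Require` syntax. An equivalent sketch for the prefix block, expressed as a CIDR range:

```apache
# Apache 2.4+ syntax; order/deny belong to the older 2.2 access module
<RequireAll>
    Require all granted
    Require not ip 192.168.0.0/16
</RequireAll>
```

Which form you need depends on the Apache version your host runs, so check before pasting.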

I always suggest creating a backup of ".htaccess": things do not always go the way we want, and it would be no use to anyone to lose access to their own server. Most common hosting providers support the ".htaccess" file; in case yours does not offer such support, I believe it is time to change host.
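Making that backup is a one-liner. A sketch run in a scratch directory, with a stand-in file so it is self-contained (paths are illustrative):

```shell
cd "$(mktemp -d)"                          # scratch directory for the demo
printf 'RewriteEngine on\n' > .htaccess    # stand-in for your real file
cp .htaccess .htaccess.bak                 # the backup itself
cmp -s .htaccess .htaccess.bak && echo "backup ok"
```

If an edit goes wrong, restoring is the same command with the arguments swapped.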

Filed under: SEO, Web 4 Comments