Robots.txt File in SEO

  • 13 February 2020
  • George

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, use a noindex directive or password-protect the page.
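For illustration, a minimal robots.txt could look like the one below; the disallowed path and sitemap URL are hypothetical placeholders:

    User-agent: *
    Disallow: /private/

    Sitemap: https://www.example.com/sitemap.xml

Note that Disallow only discourages crawling: a blocked URL can still end up indexed if other sites link to it, which is why noindex or password protection is the right tool for keeping a page out of search results.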


How to set up a proxy with authentication for HTTP client in Golang

  • 22 March 2020
  • ADM

How to set up a proxy with authentication for an HTTP client in Golang using http.Transport and http.Client, with a simple example that calls the httpbin.org test server.
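As a rough sketch of that approach (the proxy address and credentials below are placeholders, not values from the article), a Go client routed through an authenticated proxy looks like this:

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "net/url"
    )

    func main() {
        // Hypothetical proxy address with embedded credentials; replace with your own.
        proxyURL, err := url.Parse("http://user:password@127.0.0.1:8080")
        if err != nil {
            panic(err)
        }

        // http.Transport routes every request through the proxy; the user info
        // in the proxy URL is sent as basic proxy authentication.
        transport := &http.Transport{Proxy: http.ProxyURL(proxyURL)}
        client := &http.Client{Transport: transport}

        resp, err := client.Get("https://httpbin.org/ip")
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        body, _ := io.ReadAll(resp.Body)
        fmt.Println(string(body)) // httpbin echoes the caller's apparent IP
    }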


How to set up a proxy for HTTP client in Golang

  • 05 March 2018
  • ADM

How to set up a proxy for an HTTP client in Golang using http.Transport and http.Client, with a simple example that calls the httpbin.org test server.
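A variation on the same technique, sketched below, picks the proxy up from the standard HTTP_PROXY / HTTPS_PROXY environment variables instead of hard-coding an address:

    package main

    import (
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // ProxyFromEnvironment honours HTTP_PROXY, HTTPS_PROXY and NO_PROXY,
        // e.g. run with HTTP_PROXY=http://127.0.0.1:8080 set in the environment.
        transport := &http.Transport{Proxy: http.ProxyFromEnvironment}
        client := &http.Client{Transport: transport}

        resp, err := client.Get("https://httpbin.org/ip")
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        body, _ := io.ReadAll(resp.Body)
        fmt.Println(string(body))
    }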


HTTP URLConnection with proxy in Java

  • 08 February 2018
  • ADM

How to create an HTTP URLConnection with a proxy in Java, with a simple example that includes proxy authentication.
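A minimal, self-contained sketch of the idea; the proxy host, port and credentials are placeholders:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.Authenticator;
    import java.net.HttpURLConnection;
    import java.net.InetSocketAddress;
    import java.net.PasswordAuthentication;
    import java.net.Proxy;
    import java.net.URL;

    public class ProxyUrlConnection {
        public static void main(String[] args) throws Exception {
            // Hypothetical proxy endpoint; replace with your own.
            Proxy proxy = new Proxy(Proxy.Type.HTTP,
                    new InetSocketAddress("127.0.0.1", 8080));

            // Proxy authentication: the default Authenticator answers the
            // proxy's 407 challenge with these (placeholder) credentials.
            Authenticator.setDefault(new Authenticator() {
                @Override
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication("user", "password".toCharArray());
                }
            });

            URL url = new URL("http://httpbin.org/ip");
            HttpURLConnection connection = (HttpURLConnection) url.openConnection(proxy);

            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(connection.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }

Note that recent JDKs disable basic authentication for HTTPS tunnelling through a proxy by default, which is why the sketch requests a plain http:// URL.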


How to Check Alexa Rank in Java

  • 16 November 2017
  • ADM

How to check Alexa Rank in Java using the undocumented API http://data.alexa.com/data?cli=10&url=domainName. The code also shows how to add proxy settings.
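The gist of that call, as a sketch: fetch the XML response and pull the rank out of it. The endpoint is undocumented, so both its availability and the POPULARITY/TEXT layout assumed below may change without notice:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class AlexaRank {
        public static void main(String[] args) throws Exception {
            String domain = "example.com"; // placeholder domain

            // Read the raw XML response from the undocumented endpoint.
            URL url = new URL("http://data.alexa.com/data?cli=10&url=" + domain);
            StringBuilder xml = new StringBuilder();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(url.openStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    xml.append(line);
                }
            }

            // Assumption: the rank sits in the TEXT attribute of the
            // POPULARITY element, e.g. <POPULARITY URL="..." TEXT="1234"/>.
            Matcher matcher = Pattern.compile("POPULARITY[^>]*TEXT=\"(\\d+)\"")
                    .matcher(xml);
            System.out.println(matcher.find()
                    ? "Alexa rank of " + domain + ": " + matcher.group(1)
                    : "No rank found for " + domain);
        }
    }

Proxy settings would slot in the same way as in the URLConnection example above, by passing a java.net.Proxy to openConnection instead of calling openStream.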


How to fix "Briefly unavailable for scheduled maintenance" in WordPress

  • 06 October 2017
  • ADM

How to fix the "Briefly unavailable for scheduled maintenance" error in WordPress by removing the .maintenance file from the installation root folder. WordPress creates this file while it applies updates and normally deletes it afterwards; if an update is interrupted, the leftover file keeps the site locked in maintenance mode.