Robots.txt File in SEO

  • 13 February 2020
  • George

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, use a noindex directive or password-protect the page.
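For example, a minimal robots.txt placed at the root of a site might look like the following (the blocked path and sitemap URL are placeholders):

    User-agent: *
    Disallow: /admin/

    Sitemap: https://www.example.com/sitemap.xml

Note that to actually exclude an individual page from search results, you would serve it with a robots noindex meta tag (<meta name="robots" content="noindex">) or an X-Robots-Tag HTTP header rather than a Disallow rule.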


GET/POST request with Apache HttpClient

  • 27 January 2017
  • ADM

A tutorial on sending GET and POST requests with Apache HttpClient, using the httpbin.org test server.
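As a rough sketch of what such requests look like with HttpClient 4.x (the setup in the full tutorial may differ):

    import java.util.Arrays;

    import org.apache.http.client.entity.UrlEncodedFormEntity;
    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.client.methods.HttpPost;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.message.BasicNameValuePair;
    import org.apache.http.util.EntityUtils;

    public class HttpClientDemo {
        public static void main(String[] args) throws Exception {
            try (CloseableHttpClient client = HttpClients.createDefault()) {
                // GET request against the httpbin.org echo endpoint
                HttpGet get = new HttpGet("https://httpbin.org/get");
                try (CloseableHttpResponse response = client.execute(get)) {
                    System.out.println(EntityUtils.toString(response.getEntity()));
                }

                // POST request with a form-encoded body
                HttpPost post = new HttpPost("https://httpbin.org/post");
                post.setEntity(new UrlEncodedFormEntity(
                        Arrays.asList(new BasicNameValuePair("key", "value"))));
                try (CloseableHttpResponse response = client.execute(post)) {
                    System.out.println(EntityUtils.toString(response.getEntity()));
                }
            }
        }
    }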


How to create the Factory Pattern in Java

  • 03 November 2016
  • ADM

How to implement the factory design pattern in Java, using the example of creating a database table for multiple database engines.
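A condensed sketch of the idea (class names here are illustrative, not necessarily the ones used in the article): the factory returns an engine-specific implementation behind a common interface, so callers never reference concrete classes.

    // Common interface for all supported database engines
    interface TableCreator {
        String createTableQuery(String tableName);
    }

    class MySqlTableCreator implements TableCreator {
        public String createTableQuery(String tableName) {
            return "CREATE TABLE " + tableName + " (id INT AUTO_INCREMENT PRIMARY KEY)";
        }
    }

    class PostgreSqlTableCreator implements TableCreator {
        public String createTableQuery(String tableName) {
            return "CREATE TABLE " + tableName + " (id SERIAL PRIMARY KEY)";
        }
    }

    // The factory: callers ask for an engine by name
    class TableCreatorFactory {
        static TableCreator forEngine(String engine) {
            switch (engine) {
                case "mysql":      return new MySqlTableCreator();
                case "postgresql": return new PostgreSqlTableCreator();
                default: throw new IllegalArgumentException("Unknown engine: " + engine);
            }
        }
    }

Usage then reduces to a single call such as TableCreatorFactory.forEngine("mysql").createTableQuery("users").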


Get and Set Field Value using Reflection in Java

  • 31 October 2016
  • ADM

How to get and set field values using reflection in Java. The tutorial covers all access levels for a field: public, private, and protected.
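In outline, the approach looks like this (the Person class is just a stand-in for any target class):

    import java.lang.reflect.Field;

    class Person {
        private String name = "initial";
    }

    public class FieldReflectionDemo {
        public static void main(String[] args) throws Exception {
            Person person = new Person();

            // Look up the field by name on the declaring class
            Field field = Person.class.getDeclaredField("name");
            field.setAccessible(true); // required for private/protected fields

            System.out.println(field.get(person)); // read: prints "initial"
            field.set(person, "updated");          // write
            System.out.println(field.get(person)); // prints "updated"
        }
    }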


How to execute a method using reflection in Java

  • 28 October 2016
  • ADM

How to execute a method using reflection in Java, with examples for every case: with and without parameters, and with and without a return value.
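A minimal sketch covering both cases (the Calculator class is a stand-in for any target class):

    import java.lang.reflect.Method;

    class Calculator {
        private int add(int a, int b) { return a + b; }
        public void reset() { /* no-op */ }
    }

    public class MethodReflectionDemo {
        public static void main(String[] args) throws Exception {
            Calculator calc = new Calculator();

            // Method with parameters and a return value
            Method add = Calculator.class.getDeclaredMethod("add", int.class, int.class);
            add.setAccessible(true); // required because add() is private
            Object result = add.invoke(calc, 2, 3);
            System.out.println(result); // prints 5

            // Public method without parameters or a return value
            Method reset = Calculator.class.getMethod("reset");
            reset.invoke(calc);
        }
    }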


GET/POST request with HttpURLConnection in Java

  • 17 October 2016
  • ADM

How to send GET and POST requests using the standard HttpURLConnection class from the JDK, with simple examples for both methods.
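A bare-bones sketch of both requests against httpbin.org (details such as headers and error handling are simplified here):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class HttpUrlConnectionDemo {
        public static void main(String[] args) throws Exception {
            // GET request
            HttpURLConnection get =
                    (HttpURLConnection) new URL("https://httpbin.org/get").openConnection();
            get.setRequestMethod("GET");
            printBody(get);

            // POST request with a form-encoded body
            HttpURLConnection post =
                    (HttpURLConnection) new URL("https://httpbin.org/post").openConnection();
            post.setRequestMethod("POST");
            post.setDoOutput(true); // enables writing a request body
            post.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
            try (OutputStream os = post.getOutputStream()) {
                os.write("key=value".getBytes(StandardCharsets.UTF_8));
            }
            printBody(post);
        }

        private static void printBody(HttpURLConnection conn) throws Exception {
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }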