Robots.txt File in SEO
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site.
It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out
of Google. To keep a web page out of Google, use noindex directives or password-protect the page.
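A minimal robots.txt sketch illustrating that distinction: it limits crawling, not indexing. The paths below are placeholders, not recommendations for any particular site.

```
# Applies to all crawlers
User-agent: *
# Ask crawlers not to request anything under /private/
# (does NOT remove already-indexed pages from Google)
Disallow: /private/

# Location of the sitemap (optional)
Sitemap: https://www.example.com/sitemap.xml
```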
GET/POST request with Apache HttpClient
GET/POST request tutorial using Apache HttpClient. The tutorial uses the httpbin.org test server.
How to create Factory Pattern in Java
How to implement the factory design pattern in Java, using the example of creating a database table for multiple database engines.
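A minimal sketch of that factory pattern. The class and method names (Database, DatabaseFactory, createTableSql) and the SQL snippets are illustrative assumptions, not taken from the tutorial:

```java
// Common interface: callers depend on this, never on a concrete engine.
interface Database {
    String createTableSql(String table);
}

// One concrete product per database engine.
class MySqlDatabase implements Database {
    public String createTableSql(String table) {
        return "CREATE TABLE " + table + " (id INT AUTO_INCREMENT PRIMARY KEY)";
    }
}

class PostgresDatabase implements Database {
    public String createTableSql(String table) {
        return "CREATE TABLE " + table + " (id SERIAL PRIMARY KEY)";
    }
}

// The factory picks the concrete class; callers only see the interface.
class DatabaseFactory {
    static Database of(String engine) {
        switch (engine) {
            case "mysql":    return new MySqlDatabase();
            case "postgres": return new PostgresDatabase();
            default: throw new IllegalArgumentException("Unknown engine: " + engine);
        }
    }
}
```

Adding support for a new engine then means adding one class and one case in the factory, with no changes to calling code.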
Get and Set Field Value using Reflection in Java
How to get and set a field value using reflection in Java. The tutorial covers all field access levels: public, private and protected.
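A short sketch of reading and writing a private field via reflection. The Person class and its name field are hypothetical examples; the key step for non-public fields is setAccessible(true):

```java
import java.lang.reflect.Field;

class Person {
    private String name = "Alice";
}

class FieldAccessDemo {
    static String readName(Person p) throws Exception {
        Field f = Person.class.getDeclaredField("name");
        f.setAccessible(true);       // required because the field is private
        return (String) f.get(p);
    }

    static void writeName(Person p, String value) throws Exception {
        Field f = Person.class.getDeclaredField("name");
        f.setAccessible(true);
        f.set(p, value);             // overwrite the private field
    }
}
```

getDeclaredField finds fields of any access level declared on the class itself, while getField only sees public fields (including inherited ones).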
How to invoke a method using reflection in Java
How to invoke a method using reflection in Java. Examples for all cases: with and without parameters, and with and without a return value.
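A sketch of both cases with Method.invoke. The Calculator class and its methods are hypothetical examples:

```java
import java.lang.reflect.Method;

class Calculator {
    private int add(int a, int b) { return a + b; }   // parameters + return value
    public void reset() { /* no parameters, no return value */ }
}

class InvokeDemo {
    static int callAdd(Calculator c, int a, int b) throws Exception {
        // Parameter types must match the declared signature exactly.
        Method m = Calculator.class.getDeclaredMethod("add", int.class, int.class);
        m.setAccessible(true);            // needed because add is private
        return (int) m.invoke(c, a, b);   // result comes back boxed as Integer
    }

    static void callReset(Calculator c) throws Exception {
        Method m = Calculator.class.getMethod("reset");
        m.invoke(c);                      // void methods: invoke returns null
    }
}
```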
GET/POST request with HttpURLConnection in Java
How to send GET and POST requests using the standard HttpURLConnection class in Java. Simple examples for both methods.
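A sketch of a POST request with HttpURLConnection, assuming an httpbin.org-style endpoint that accepts form data. The URL and form fields are placeholders; the essential steps are setDoOutput(true) and writing the encoded body to the output stream:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;

class HttpDemo {
    // Encode form fields as application/x-www-form-urlencoded.
    static String formBody(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (sb.length() > 0) sb.append('&');
            sb.append(URLEncoder.encode(e.getKey(), StandardCharsets.UTF_8))
              .append('=')
              .append(URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8));
        }
        return sb.toString();
    }

    static int post(String url, Map<String, String> fields) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);   // required before writing a request body
        conn.setRequestProperty("Content-Type",
                "application/x-www-form-urlencoded");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(formBody(fields).getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode();   // e.g. 200 on success
    }
}
```

A GET request is the simpler case: open the connection, call setRequestMethod("GET"), and read the response stream without writing a body.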