Robots.txt File in SEO

  • 13 February 2020
  • George

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, use noindex directives or password-protect the page.
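A minimal sketch of what such a file, placed at the site root, might contain (the blocked path is only a placeholder):

    User-agent: *
    Disallow: /admin/
    Allow: /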


How to generate JAXB classes with xjc

  • 05 March 2018
  • ADM

How to generate JAXB classes from an XSD schema using the xjc command-line tool: xjc -d src -p com.admfactory.client schema.xsd
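The same command, annotated; it assumes schema.xsd is in the current directory and that the src output directory already exists:

    # Generate Java classes from schema.xsd
    #   -d src                     write the generated sources into the src directory
    #   -p com.admfactory.client   put the generated classes in this package
    xjc -d src -p com.admfactory.client schema.xsd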


HTTP URLConnection with proxy in Java

  • 08 February 2018
  • ADM

How to create an HTTP URLConnection through a proxy in Java, including a simple example with proxy authentication.
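A minimal sketch of the idea using the standard java.net classes; the proxy host, port, credentials and target URL below are placeholders:

    import java.net.Authenticator;
    import java.net.HttpURLConnection;
    import java.net.InetSocketAddress;
    import java.net.PasswordAuthentication;
    import java.net.Proxy;
    import java.net.URL;

    public class ProxyConnection {
        public static void main(String[] args) throws Exception {
            // Placeholder proxy host and port
            Proxy proxy = new Proxy(Proxy.Type.HTTP,
                    new InetSocketAddress("proxy.example.com", 8080));

            // Proxy authentication via the default Authenticator (placeholder credentials)
            Authenticator.setDefault(new Authenticator() {
                @Override
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication("username", "password".toCharArray());
                }
            });

            // Open the connection through the proxy
            HttpURLConnection conn = (HttpURLConnection)
                    new URL("http://example.com").openConnection(proxy);
            System.out.println("Response code: " + conn.getResponseCode());
        }
    }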


How to Check Alexa Rank in Java

  • 16 November 2017
  • ADM

How to check Alexa Rank in Java using the undocumented API: http://data.alexa.com/data?cli=10&url=domainName. The code also includes optional proxy settings.
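A rough sketch of such a request; the domain is a placeholder and, since the API is undocumented, the exact shape of the returned XML is not guaranteed:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.net.URLConnection;

    public class AlexaRank {
        public static void main(String[] args) throws Exception {
            String domainName = "example.com";  // placeholder domain
            URL url = new URL("http://data.alexa.com/data?cli=10&url=" + domainName);
            URLConnection conn = url.openConnection();

            // Read the raw XML response
            StringBuilder xml = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    xml.append(line).append('\n');
                }
            }
            // The rank is expected in the POPULARITY element of the response,
            // but the undocumented format may change without notice.
            System.out.println(xml);
        }
    }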


JAXB Hello World Marshalling / Unmarshalling Example

  • 03 August 2017
  • ADM

JAXB Hello World Marshalling/Unmarshalling example: Object to XML and XML to Object.
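A minimal sketch of the round trip with a hypothetical Greeting class:

    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.JAXBException;
    import javax.xml.bind.annotation.XmlRootElement;
    import java.io.StringReader;
    import java.io.StringWriter;

    public class JaxbHelloWorld {

        @XmlRootElement
        public static class Greeting {
            public String message;
        }

        public static void main(String[] args) throws JAXBException {
            JAXBContext context = JAXBContext.newInstance(Greeting.class);

            // Marshalling: Object to XML
            Greeting greeting = new Greeting();
            greeting.message = "Hello World";
            StringWriter xml = new StringWriter();
            context.createMarshaller().marshal(greeting, xml);
            System.out.println(xml);

            // Unmarshalling: XML back to Object
            Greeting back = (Greeting) context.createUnmarshaller()
                    .unmarshal(new StringReader(xml.toString()));
            System.out.println(back.message);
        }
    }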


How to create tiny HTTP server in Java

  • 08 May 2017
  • ADM

How to create a tiny HTTP server in Java using the ServerSocket class.
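A minimal sketch of the idea, answering every request with a plain-text response; the port number is arbitrary:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class TinyHttpServer {
        public static void main(String[] args) throws IOException {
            try (ServerSocket server = new ServerSocket(8080)) {
                while (true) {
                    try (Socket client = server.accept();
                         BufferedReader in = new BufferedReader(
                                 new InputStreamReader(client.getInputStream()));
                         OutputStream out = client.getOutputStream()) {

                        // Read only the request line, e.g. "GET / HTTP/1.1"
                        String requestLine = in.readLine();
                        System.out.println("Request: " + requestLine);

                        // Send a minimal HTTP/1.1 response
                        String body = "Hello from a tiny Java HTTP server";
                        String response = "HTTP/1.1 200 OK\r\n"
                                + "Content-Type: text/plain\r\n"
                                + "Content-Length: " + body.length() + "\r\n"
                                + "\r\n"
                                + body;
                        out.write(response.getBytes());
                    }
                }
            }
        }
    }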