Robots.txt File in SEO
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site.
This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out
of Google. To keep a web page out of Google, use a noindex directive or password-protect the page.
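For illustration, a minimal robots.txt, served at the site root, might look like this (the paths and sitemap URL are placeholders):

    User-agent: *
    Disallow: /admin/
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml

A noindex directive, by contrast, belongs in the page itself as <meta name="robots" content="noindex"> or in an X-Robots-Tag HTTP response header; Google no longer honors noindex rules placed inside robots.txt.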
How to generate JAXB classes with xjc
How to generate JAXB classes from an XSD schema with the xjc command-line tool: xjc -d src -p com.admfactory.client schema.xsd
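As a rough sketch, for a hypothetical schema.xsd defining a single customer element with a name field, the command above would emit a class along these lines into src (real output also includes an ObjectFactory and depends entirely on the schema):

    // Hypothetical xjc output for a "customer" element with a "name" string field.
    package com.admfactory.client;

    import javax.xml.bind.annotation.XmlAccessType;
    import javax.xml.bind.annotation.XmlAccessorType;
    import javax.xml.bind.annotation.XmlRootElement;

    @XmlRootElement(name = "customer")
    @XmlAccessorType(XmlAccessType.FIELD)
    public class Customer {
        private String name;

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }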
HTTP URLConnection with proxy in Java
How to create an HTTP URLConnection with proxy in Java. Simple example including proxy authentication.
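A minimal sketch of the approach, with placeholder proxy host, port, credentials, and target URL:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.Authenticator;
    import java.net.HttpURLConnection;
    import java.net.InetSocketAddress;
    import java.net.PasswordAuthentication;
    import java.net.Proxy;
    import java.net.URL;

    public class ProxyExample {
        public static void main(String[] args) throws Exception {
            // Placeholder proxy address.
            Proxy proxy = new Proxy(Proxy.Type.HTTP,
                    new InetSocketAddress("proxy.example.com", 8080));

            // Proxy authentication: supply credentials when the proxy challenges.
            Authenticator.setDefault(new Authenticator() {
                @Override
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication("user", "pass".toCharArray());
                }
            });

            // Open the connection through the proxy and print the response body.
            HttpURLConnection conn = (HttpURLConnection)
                    new URL("https://www.example.com/").openConnection(proxy);
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }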
How to Check Alexa Rank in Java
How to Check Alexa Rank in Java using the undocumented API http://data.alexa.com/data?cli=10&url=domainName. The code also supports optional proxy settings.
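A sketch of the idea, assuming the endpoint returns XML containing a POPULARITY element whose TEXT attribute holds the rank (Alexa has since been retired, so this is illustrative only; proxy settings are omitted here):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class AlexaRank {
        public static void main(String[] args) throws Exception {
            String domain = "example.com";
            URL url = new URL("http://data.alexa.com/data?cli=10&url=" + domain);

            // Read the raw XML response.
            StringBuilder xml = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(url.openStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    xml.append(line);
                }
            }

            // Assumes a fragment like: <POPULARITY ... TEXT="12345" .../>
            Matcher m = Pattern.compile("POPULARITY[^>]*TEXT=\"(\\d+)\"").matcher(xml);
            System.out.println(m.find() ? "Alexa rank: " + m.group(1) : "No rank found");
        }
    }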
JAXB Hello World Marshalling / Unmarshalling Example
JAXB Hello World Marshalling/Unmarshalling example: Object to XML and XML to Object.
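A minimal sketch of both directions, using a hand-written class with the javax.xml.bind JAXB API (bundled with the JDK through Java 8, a separate dependency afterwards):

    import java.io.StringReader;
    import java.io.StringWriter;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Marshaller;
    import javax.xml.bind.Unmarshaller;
    import javax.xml.bind.annotation.XmlRootElement;

    public class HelloJaxb {

        @XmlRootElement
        public static class Greeting {
            public String message;
        }

        public static void main(String[] args) throws Exception {
            JAXBContext ctx = JAXBContext.newInstance(Greeting.class);

            // Marshalling: object to XML.
            Greeting g = new Greeting();
            g.message = "Hello World";
            StringWriter out = new StringWriter();
            Marshaller m = ctx.createMarshaller();
            m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
            m.marshal(g, out);
            System.out.println(out);

            // Unmarshalling: XML back to object.
            Unmarshaller u = ctx.createUnmarshaller();
            Greeting back = (Greeting) u.unmarshal(new StringReader(out.toString()));
            System.out.println(back.message);
        }
    }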
How to create a tiny HTTP server in Java
How to create a tiny HTTP server in Java using the ServerSocket class.
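A minimal sketch of the ServerSocket approach: single-threaded, hard-coded port, and no request parsing beyond reading the request line:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class TinyHttpServer {
        public static void main(String[] args) throws Exception {
            try (ServerSocket server = new ServerSocket(8080)) {
                System.out.println("Listening on http://localhost:8080/");
                while (true) {
                    try (Socket client = server.accept()) {
                        // Log the request line, e.g. "GET / HTTP/1.1".
                        BufferedReader in = new BufferedReader(
                                new InputStreamReader(client.getInputStream()));
                        System.out.println(in.readLine());

                        // Write a fixed plain-text HTTP/1.1 response.
                        String body = "Hello from a tiny Java HTTP server";
                        String response = "HTTP/1.1 200 OK\r\n"
                                + "Content-Type: text/plain\r\n"
                                + "Content-Length: " + body.length() + "\r\n"
                                + "Connection: close\r\n"
                                + "\r\n" + body;
                        OutputStream out = client.getOutputStream();
                        out.write(response.getBytes(StandardCharsets.US_ASCII));
                        out.flush();
                    }
                }
            }
        }
    }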