Robots.txt File in SEO
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site.
This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out
of Google. To keep a web page out of Google, use noindex directives or password-protect the page.
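As a minimal sketch, a robots.txt file at the site root might look like this (the paths and sitemap URL are placeholders, not from any real site):

```
# robots.txt — asks well-behaved crawlers not to request /private/.
# Note: this reduces crawl load; it does NOT guarantee the pages
# stay out of Google's index if they are linked from elsewhere.
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

To actually keep a page out of the index, the usual mechanism is a noindex meta tag in the page's HTML head, e.g. `<meta name="robots" content="noindex">`, or HTTP authentication on the page.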
How to create a Java project with Maven
How to create a Java project with Maven. Simple steps to generate the project, import in Eclipse and run the application.
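The steps above can be sketched with the standard quickstart archetype (the groupId and artifactId below are placeholder values):

```shell
# Generate a simple Java project from the quickstart archetype
mvn archetype:generate \
    -DgroupId=com.example.app \
    -DartifactId=my-app \
    -DarchetypeArtifactId=maven-archetype-quickstart \
    -DinteractiveMode=false

# Generate Eclipse project files so the project can be imported into Eclipse
cd my-app
mvn eclipse:eclipse

# Build and run the generated application
mvn package
java -cp target/my-app-1.0-SNAPSHOT.jar com.example.app.App
```

The final jar name depends on the version the archetype generates (1.0-SNAPSHOT by default), so adjust the classpath if yours differs.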
How to skip Maven unit tests
How to skip Maven unit tests using the maven.test.skip property and the Surefire plugin's skipTests property.
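The two properties behave slightly differently, as this sketch shows:

```shell
# Skip running unit tests; test sources are still compiled
mvn package -DskipTests

# Skip both compiling and running unit tests
mvn package -Dmaven.test.skip=true
```

`skipTests` is the Surefire plugin property; `maven.test.skip` is honored by both the compiler and Surefire, so it is the more aggressive of the two.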
Apache Maven Basic Operations
Apache Maven Basic Operations contains short descriptions of Maven commands such as clean, package, install, deploy, and mvn eclipse:eclipse.
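A quick reference for what each of those commands does:

```shell
mvn clean            # delete the target/ build directory
mvn package          # compile, run tests, and build the JAR/WAR into target/
mvn install          # package, then copy the artifact into the local repository (~/.m2)
mvn deploy           # install, then upload the artifact to a remote repository
mvn eclipse:eclipse  # generate Eclipse .project and .classpath files
```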
How to include library into Maven local repository
How to include a library in the local Maven repository using the mvn install command. Useful for libraries that are not available in public repositories.
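A minimal sketch of installing a local jar, using the install plugin's install-file goal (the file path and coordinates below are placeholders for the library being installed):

```shell
# Copy a local jar into the local repository (~/.m2/repository)
# so other projects can declare it as a normal dependency
mvn install:install-file \
    -Dfile=path/to/library.jar \
    -DgroupId=com.example \
    -DartifactId=library \
    -Dversion=1.0 \
    -Dpackaging=jar
```

After this, the library can be referenced in a pom.xml with the same groupId, artifactId, and version.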
Eclipse proxy authentication required
Recently I encountered this problem: not being able to install a new plugin from "Eclipse Marketplace..." or "Install New Software...". The error displayed was "HTTP Proxy Authentication Required". The cause was that my computer was behind a proxy server.
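The usual fix is to configure the proxy in Eclipse under Window > Preferences > General > Network Connections, setting the active provider to Manual and entering the proxy host, port, and credentials. Those settings end up in a preferences file that, as a sketch, looks roughly like this (the host, port, and user are placeholder values):

```
# <eclipse>/configuration/.settings/org.eclipse.core.net.prefs
# (hypothetical values — substitute your proxy's host, port, and user)
proxiesEnabled=true
systemProxiesEnabled=false
proxyData/HTTP/host=proxy.example.com
proxyData/HTTP/port=8080
proxyData/HTTP/hasAuth=true
proxyData/HTTP/user=myuser
```

Eclipse prompts for the proxy password on the next connection attempt rather than storing it in plain text here.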