Robots.txt File in SEO
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, use a noindex directive or password-protect the page.
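As a minimal sketch, a robots.txt file placed at the site root might look like this (the paths are placeholders, not recommendations):

```
# Example robots.txt -- paths are illustrative placeholders.
# Allow all crawlers, but ask them not to fetch the /admin/ area.
User-agent: *
Disallow: /admin/
```

To actually keep a page out of search results, a noindex directive such as `<meta name="robots" content="noindex">` in the page's HTML head is the appropriate mechanism, since a page blocked by robots.txt can still be indexed if other sites link to it.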
How to disable a submit button with jQuery
For a long-running submit operation, or to prevent users from clicking a submit button twice, a common solution is to disable the button after it has been clicked. To disable a button with jQuery, set its disabled property to true.
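A minimal sketch of this pattern, assuming a hypothetical form and button with the ids shown (jQuery loaded on the page):

```html
<!-- The form and button ids here are assumptions for this example. -->
<form id="myForm" action="/submit" method="post">
  <button type="submit" id="submitBtn">Submit</button>
</form>

<script>
  // When the form is submitted, disable the button so it
  // cannot be clicked a second time while the request runs.
  $('#myForm').on('submit', function () {
    $('#submitBtn').prop('disabled', true);
  });
</script>
```

Since jQuery 1.6, `.prop('disabled', true)` is the idiomatic way to set the disabled state, rather than `.attr()`.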
How to change Tomcat default port
How to change Tomcat's default port 8080 to any other port, in three steps: locate the server.xml file, edit the Connector port, and restart the Tomcat service.
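The relevant setting is the HTTP Connector in server.xml (typically under $CATALINA_HOME/conf/). A sketch of the edited element, assuming 9090 as the new port:

```xml
<!-- conf/server.xml: change port="8080" to the desired value,
     e.g. 9090, then restart the Tomcat service. -->
<Connector port="9090" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443" />
```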
Tomcat default administrator password
How to set an administrator password for the default Tomcat Manager application. This setting applies to Tomcat 7 and newer.
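Manager access is configured in tomcat-users.xml. A minimal sketch, where the username and password are placeholders you should replace:

```xml
<!-- $CATALINA_HOME/conf/tomcat-users.xml
     The manager-gui role grants access to the Manager web interface
     in Tomcat 7 and newer; username/password below are placeholders. -->
<tomcat-users>
  <role rolename="manager-gui"/>
  <user username="admin" password="changeme" roles="manager-gui"/>
</tomcat-users>
```

Restart Tomcat after editing the file for the change to take effect.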
How to set up a proxy with authentication for an HTTP client in Golang
How to set up a proxy with authentication for an HTTP client in Golang, using http.Transport and http.Client. A simple example calling the httpbin.org test server.
How to generate JAXB classes with xjc
How to generate JAXB classes with xjc. Command line: xjc -d src -p com.admfactory.client schema.xsd
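For context, a minimal hypothetical schema.xsd that the command above could consume (the element and field names are illustrative):

```xml
<!-- schema.xsd: a minimal schema for illustration only. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="customer">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="name" type="xs:string"/>
        <xs:element name="age" type="xs:int"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Running `xjc -d src -p com.admfactory.client schema.xsd` against a schema like this writes the generated classes (including an ObjectFactory) under src/com/admfactory/client; `-d` sets the output directory and `-p` the target package.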