Feb 13, 2020

George

Robots.txt File in SEO

A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a page out of Google, use a noindex directive or password-protect the page.
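
For illustration, a minimal robots.txt might look like the sketch below; the blocked path and sitemap URL are placeholders, not recommendations for any particular site:

    # Allow all crawlers, but keep them out of /admin/
    User-agent: *
    Disallow: /admin/

    Sitemap: https://www.example.com/sitemap.xml

To keep an individual page out of the index, a robots meta tag in that page's HTML does the job:

    <meta name="robots" content="noindex">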

Mar 10, 2018

ADM

How to Install GoLang 1.10 on Ubuntu

How to Install GoLang 1.10 on Ubuntu. The same steps apply to any Go version and any Ubuntu release.
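
As a rough sketch of the usual procedure, assuming the 64-bit 1.10 tarball (pick the archive that matches your version and architecture from golang.org/dl):

    # download and unpack the official tarball into /usr/local
    wget https://dl.google.com/go/go1.10.linux-amd64.tar.gz
    sudo tar -C /usr/local -xzf go1.10.linux-amd64.tar.gz

    # put the go tool on the PATH (add this line to ~/.profile to make it permanent)
    export PATH=$PATH:/usr/local/go/bin
    go version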

Mar 05, 2018

ADM

How to generate JAXB classes with xjc

How to generate JAXB classes with xjc from the command line: xjc -d src -p com.admfactory.client schema.xsd, where -d sets the output source directory and -p sets the target Java package.
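
Once generated, the classes are typically used through a JAXBContext. A minimal sketch, assuming xjc produced a root-element class named Customer in com.admfactory.client (the actual class names depend on schema.xsd):

    import java.io.File;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Unmarshaller;

    import com.admfactory.client.Customer;   // hypothetical generated class

    public class JaxbReadExample {
        public static void main(String[] args) throws Exception {
            // build a context for the package xjc generated (it contains ObjectFactory)
            JAXBContext context = JAXBContext.newInstance("com.admfactory.client");

            // unmarshal an XML document that conforms to schema.xsd
            Unmarshaller unmarshaller = context.createUnmarshaller();
            Customer customer = (Customer) unmarshaller.unmarshal(new File("customer.xml"));

            System.out.println(customer);
        }
    }

Depending on how the schema declares its top-level element, unmarshal may return a JAXBElement wrapper instead of the class itself; in that case call getValue() on the result.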

Oct 06, 2017

ADM

How to fix "Briefly unavailable for scheduled maintenance" in WordPress

Fix the "Briefly unavailable for scheduled maintenance" message in WordPress by removing the .maintenance file from the installation's root folder.
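
From a shell on the server this is a single command; the path below is a placeholder for your actual WordPress root:

    # delete the leftover maintenance flag that a failed update left behind
    rm /var/www/html/.maintenance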

Jul 12, 2017

ADM

How to import and export Java Projects in Eclipse

How to import and export Java Projects in Eclipse in simple and intuitive steps.

Jun 15, 2017

ADM

Installation failed: Could not create directory in WordPress

When you get the "Installation failed: Could not create directory." error in WordPress, you can fix it with two easy steps.
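
The post's exact steps aren't reproduced here, but this error usually means the web server can't write to wp-content. A typical check, assuming a Debian/Ubuntu setup where the server runs as www-data and WordPress lives in /var/www/html:

    # see who owns the directory WordPress needs to write to
    ls -ld /var/www/html/wp-content

    # hand ownership to the web server user if it isn't already there
    sudo chown -R www-data:www-data /var/www/html/wp-content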