Robots.txt File in SEO

  • 13 February 2020
  • George

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, use noindex directives or password-protect the page.
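As a sketch of the format, a minimal robots.txt might look like the following; the paths and sitemap URL are placeholders, not taken from the article:

```
# Ask all crawlers to avoid requesting URLs under /admin/
User-agent: *
Disallow: /admin/

# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```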


How to delete the Windows.old folder from Windows 10

  • 30 July 2020
  • ADM

If you recently upgraded your OS to Windows 10, you might notice that you are running out of space. If this is happening, it might be because the old version of Windows has not yet been deleted from the disk; it is stored in the "Windows.old" folder. Depending on the size of that version, it could be taking up a lot of precious space.
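The teaser does not spell out the removal steps. The graphical route is Disk Cleanup's "Previous Windows installation(s)" option; a commonly used command-line alternative (an assumption here, not necessarily the article's method) is:

```
:: Take ownership of the folder, grant Administrators full control, then delete it
takeown /F C:\Windows.old /R /A /D Y
icacls C:\Windows.old /T /grant Administrators:F
rd /S /Q C:\Windows.old
```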


Java Custom Annotations Example

  • 26 July 2018
  • ADM

How to create a custom Java annotation. A simple example of creating annotations in Java and reading them back at runtime through the Reflection API.
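A minimal sketch of the idea: define an annotation with runtime retention, apply it, and read it back via reflection. The annotation name Info and the demo class are hypothetical, not from the article.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// Hypothetical custom annotation; RUNTIME retention makes it visible to reflection.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Info {
    String author() default "unknown";
}

public class AnnotationDemo {

    @Info(author = "ADM")
    public void annotatedMethod() { }

    public static void main(String[] args) throws Exception {
        // Look up the method and read its annotation through the Reflection API.
        Method m = AnnotationDemo.class.getMethod("annotatedMethod");
        Info info = m.getAnnotation(Info.class);
        System.out.println("author = " + info.author());
    }
}
```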


How to execute shell command from Java

  • 20 June 2018
  • ADM

How to execute a shell command from Java using the Runtime.getRuntime().exec() method.
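A minimal, self-contained sketch: run a command, stream its output, and wait for the exit code. The "ls" command is an assumption (on Windows, something like "cmd /c dir" would be needed instead).

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ExecDemo {
    public static void main(String[] args) throws Exception {
        // Launch the external command as a child process.
        Process process = Runtime.getRuntime().exec("ls");

        // Read the process's standard output line by line.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }

        // Block until the command finishes and report its exit code.
        int exitCode = process.waitFor();
        System.out.println("Exit code: " + exitCode);
    }
}
```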


How to generate UUID / GUID in Java

  • 19 June 2018
  • ADM

How to generate a UUID / GUID in Java using the java.util.UUID class and its UUID.randomUUID() method.
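A minimal sketch of the call mentioned in the teaser:

```java
import java.util.UUID;

public class UuidDemo {
    public static void main(String[] args) {
        // Generates a random (version 4) UUID.
        UUID uuid = UUID.randomUUID();
        System.out.println(uuid.toString());
    }
}
```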


How to read XML file in Java using SAX Parser

  • 04 April 2018
  • ADM

How to read an XML file in Java using a SAX parser.
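A minimal sketch of the event-driven SAX approach: the parser streams through the document and calls back into a handler. The file name "data.xml" and the handler overrides are illustrative placeholders, not the article's exact code.

```java
import java.io.File;

import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxDemo {
    public static void main(String[] args) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();

        // SAX invokes these callbacks as it streams through the document;
        // nothing is held in memory beyond the current event.
        DefaultHandler handler = new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attributes) {
                System.out.println("Start element: " + qName);
            }

            @Override
            public void characters(char[] ch, int start, int length) {
                String text = new String(ch, start, length).trim();
                if (!text.isEmpty()) {
                    System.out.println("Text: " + text);
                }
            }
        };

        // "data.xml" is a placeholder file name.
        parser.parse(new File("data.xml"), handler);
    }
}
```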