Robots.txt File in SEO
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, use a noindex directive or password-protect the page.
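As a sketch, a minimal robots.txt might look like the following; the `/private/` path and sitemap URL are placeholders, not rules any real site requires:

```
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the site (e.g. `https://www.example.com/robots.txt`); crawlers that honor it will skip the disallowed paths, but the rules are advisory, which is why they don't keep pages out of the index.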
How to delete the Windows.old folder from Windows 10
If you recently updated your OS to Windows 10, you might notice that you are running out of space. If this is happening, it may be because the previous version of Windows has not yet been deleted from the disk; it is kept in the "Windows.old" folder, and depending on the size of that version, it can take up a lot of precious space.
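One common way to remove the folder from an elevated Command Prompt is sketched below; the commands assume Windows.old sits on drive C: and that you run them as Administrator. The Disk Cleanup tool ("Previous Windows installation(s)" option) is a safer, GUI-based alternative.

```
takeown /F C:\Windows.old /R /A /D Y
icacls C:\Windows.old /grant Administrators:F /T
rd /S /Q C:\Windows.old
```

`takeown` and `icacls` transfer ownership and grant full control, since the folder's files are otherwise protected; `rd /S /Q` then deletes the tree without prompting.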
Java Custom Annotations Example
How to create a custom annotation in Java: a simple example of defining an annotation and reading it at runtime through the Reflection API.
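A minimal sketch of the idea: the `Todo` annotation and `process` method below are invented for illustration. The annotation must be declared with `RetentionPolicy.RUNTIME` so the Reflection API can see it at run time.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

public class AnnotationDemo {
    // Hypothetical custom annotation, retained at runtime so reflection can read it
    @Retention(RetentionPolicy.RUNTIME)
    @interface Todo {
        String value();
    }

    @Todo("optimize this later")
    static void process() { }

    public static void main(String[] args) throws Exception {
        // Look up the method and read its annotation via the Reflection API
        Method m = AnnotationDemo.class.getDeclaredMethod("process");
        Todo todo = m.getAnnotation(Todo.class);
        System.out.println(todo.value()); // prints "optimize this later"
    }
}
```

Without the `@Retention(RetentionPolicy.RUNTIME)` meta-annotation, `getAnnotation` would return `null`, because the default retention discards annotations before run time.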
How to execute shell command from Java
How to execute a shell command from Java using the Runtime.getRuntime().exec() method.
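A short sketch of the approach: it runs `echo` (assumed to be on the PATH) and reads the command's standard output. The string-array overload of `exec` avoids shell-quoting issues.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ExecDemo {
    public static void main(String[] args) throws Exception {
        // Launch the command; each argument is a separate array element
        Process p = Runtime.getRuntime().exec(new String[] {"echo", "hello"});

        // Read the process's standard output line by line
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }

        // Wait for the process to finish and report its exit code
        int exit = p.waitFor();
        System.out.println("exit code: " + exit);
    }
}
```

For anything more involved (working directory, environment, redirecting streams), `ProcessBuilder` is usually the more flexible choice.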
How to generate UUID / GUID in Java
How to generate a UUID / GUID in Java using the java.util.UUID class and its UUID.randomUUID() method.
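A minimal sketch: `UUID.randomUUID()` returns a randomly generated (version 4) UUID, and `toString()` gives the familiar 36-character hyphenated form.

```java
import java.util.UUID;

public class UuidDemo {
    public static void main(String[] args) {
        // Generate a random (version 4) UUID
        UUID uuid = UUID.randomUUID();

        // Canonical 36-character string, e.g. xxxxxxxx-xxxx-4xxx-xxxx-xxxxxxxxxxxx
        System.out.println(uuid);
        System.out.println("version: " + uuid.version()); // prints "version: 4"
    }
}
```

Each call produces a statistically unique value, which is why random UUIDs are a common choice for database keys and correlation IDs.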