Robots.txt File in SEO
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, use a noindex directive or password-protect the page.
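As a sketch, a minimal robots.txt served at the site root (e.g. https://example.com/robots.txt) might look like this; the paths and sitemap URL are illustrative assumptions, not part of any real site:

```
# Apply to all crawlers
User-agent: *
# Keep crawlers out of an assumed /admin/ area
Disallow: /admin/

# Point crawlers at the sitemap (assumed location)
Sitemap: https://example.com/sitemap.xml
```

Note that a Disallow rule only discourages crawling; a page blocked this way can still appear in search results if other sites link to it, which is why noindex or password protection is the right tool for exclusion.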
How to delete the Windows.old folder from Windows 10
If you recently updated your OS to Windows 10, you might notice that you are running out of space. If this is happening, it may be because the old version of Windows has not yet been deleted from the disk; it is stored in the "Windows.old" folder. Depending on the size of that version, it could be taking up a lot of precious space.
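One way to remove the folder safely is the built-in Disk Cleanup tool, which can be launched from the command line; a minimal sketch, assuming the system drive is C::

```shell
:: Launch Disk Cleanup against the system drive, then tick
:: "Previous Windows installation(s)" in the list and confirm.
:: (Click "Clean up system files" first if the entry is not shown.)
cleanmgr /d C:
```

Deleting Windows.old by hand in Explorer tends to fail on permissions; Disk Cleanup removes it with the correct privileges.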
How to change DNS settings from command line
The command-line tool used to change the DNS settings is netsh, which allows you to configure just about any aspect of your network connections in Windows. To use the tool, the command line needs to be started as Administrator.
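A minimal sketch of the relevant netsh commands, run from an elevated Command Prompt; "Ethernet" is an assumed interface name and 8.8.8.8 / 8.8.4.4 are example DNS servers:

```shell
:: List interface names to find the one to change
netsh interface show interface

:: Set a static primary DNS server on the "Ethernet" interface
netsh interface ip set dns name="Ethernet" static 8.8.8.8

:: Add a secondary DNS server
netsh interface ip add dns name="Ethernet" 8.8.4.4 index=2

:: Or revert to DHCP-assigned DNS
netsh interface ip set dns name="Ethernet" dhcp
```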
How to change computer's IP address from command line
The command-line tool used to change the IP address is netsh, which allows you to configure just about any aspect of your network connections in Windows. To use the tool, the command line needs to be started as Administrator.
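A minimal sketch of the commands, run from an elevated Command Prompt; "Ethernet" is an assumed interface name, and the IP address, subnet mask, and gateway are example values for a typical home network:

```shell
:: Assign a static IP address, subnet mask, and default gateway
netsh interface ip set address name="Ethernet" static 192.168.1.50 255.255.255.0 192.168.1.1

:: Or switch the interface back to DHCP
netsh interface ip set address name="Ethernet" dhcp
```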
How to Install GoLang 1.10 on Ubuntu
How to install GoLang 1.10 on Ubuntu. The same steps can be applied to any Go version and any Ubuntu release.
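The install usually comes down to downloading the official binary tarball and extracting it to /usr/local; a sketch for 1.10 on amd64 (swap the version in the filename for other releases):

```shell
# Download the official Go 1.10 binary release for linux/amd64
wget https://dl.google.com/go/go1.10.linux-amd64.tar.gz

# Extract to /usr/local, which creates /usr/local/go
sudo tar -C /usr/local -xzf go1.10.linux-amd64.tar.gz

# Put the go binary on PATH (append this line to ~/.profile to persist it)
export PATH=$PATH:/usr/local/go/bin

# Verify the installation
go version
```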
HTTP URLConnection with proxy in Java
How to create an HTTP URLConnection with a proxy in Java, with a simple example including proxy authentication.
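A minimal sketch using the standard java.net classes: a Proxy object is passed to URL.openConnection, and an Authenticator supplies credentials if the proxy requires them. The proxy host, port, and credentials below are placeholder assumptions:

```java
import java.io.IOException;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.PasswordAuthentication;
import java.net.Proxy;
import java.net.URL;

public class ProxyExample {
    // Hypothetical proxy endpoint and credentials, for illustration only
    static final String PROXY_HOST = "proxy.example.com";
    static final int PROXY_PORT = 8080;

    public static HttpURLConnection openViaProxy(String url) throws IOException {
        // Route the connection through an HTTP proxy
        Proxy proxy = new Proxy(Proxy.Type.HTTP,
                new InetSocketAddress(PROXY_HOST, PROXY_PORT));

        // Supply credentials in case the proxy requires authentication
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("user", "pass".toCharArray());
            }
        });

        // openConnection does not connect yet; call connect() or read
        // the response to actually go through the proxy
        return (HttpURLConnection) new URL(url).openConnection(proxy);
    }
}
```

Note that Authenticator.setDefault is JVM-wide; for per-connection proxy auth you would instead set the Proxy-Authorization header yourself.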