How to clean /tmp directory automatically in Linux/cPanel using tmpwatch

The “tmpwatch” command in Linux removes files which haven’t been accessed for a given period of time. It works recursively, and it is normally used to clean up directories that hold temporary files, such as /tmp.

If you notice “/tmp” getting overloaded with files and are not sure which files or folders to delete, you can use “tmpwatch” to clean them out of the “/tmp” directory.

You’ll need SSH root access to install tmpwatch and add it to the cron. If your server is inaccessible because “/tmp” is full, you may restart the server; that should free up some space after the reboot.

  1. Log in to the server as root using SSH
  2. Run the following command:

    # yum install tmpwatch -y

  3. To delete temporary files older than a given number of hours (12 in this example), run the following command:

    # /usr/sbin/tmpwatch -am 12 /tmp

  4. The next step is to configure tmpwatch to run automatically through a cron job. To do that, type the following command:

    # crontab -e

  5. The above command will open the cron job list for the root user. Go to the bottom, add the following line and save the file:

    0 4 * * * /usr/sbin/tmpwatch -am 12 /tmp

    If you are unable to add the above line, you may navigate to “/var/spool/cron” and open the cron file “root” with a text editor (such as vi or nano). Add the line at the bottom and save the file (a non-interactive alternative is sketched after these steps):

    0 4 * * * /usr/sbin/tmpwatch -am 12 /tmp
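If you cannot or prefer not to edit the root crontab interactively, a cron drop-in file can be used instead. The following is a minimal sketch assuming a CentOS/RHEL-style server where crond reads /etc/cron.d; note that /etc/cron.d entries need an extra user field (“root”):

# Create /etc/cron.d/tmpwatch so tmpwatch runs daily at 04:00 as root
cat > /etc/cron.d/tmpwatch <<'EOF'
0 4 * * * root /usr/sbin/tmpwatch -am 12 /tmp
EOF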

Check the usage of “/tmp” and it should be clean by now.
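For example, you can compare the space used by “/tmp” before and after the cleanup with standard tools:

# Free and used space on the filesystem that holds /tmp
df -h /tmp

# Total size of everything currently inside /tmp
du -sh /tmp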

Thank you.

How to force redirect http to https using .htaccess


If you have an SSL certificate installed and would like to force your website to load over HTTPS, you may add the following code to the .htaccess file (Linux-based hosting):

RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301,NE]

You need to place the .htaccess file in the document root of your website (on cPanel, that is usually the public_html folder).
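Once the rules are in place, you can verify the redirect from a shell using curl (example.com below is only a placeholder for your own domain):

# Request the plain-HTTP URL and show only the response headers
curl -I http://example.com/

# Expected output includes:
#   HTTP/1.1 301 Moved Permanently
#   Location: https://example.com/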

 

10 best Web Hosting companies – Plans & Pricing 2019

# 1



HostGator is one of the largest shared hosting providers in the world. There are good reasons why millions of clients host their websites with HostGator: their service quality and support are superb, and that is what sets them apart from others. Following are their shared hosting packages:

[Image: HostGator shared hosting packages]

Besides Shared hosting - HostGator offers: Cloud Hosting, WordPress Hosting, VPS & Dedicated Servers.

# 2



SiteGround is a popular hosting provider known for its rock-solid services and unbeatable support. Their shared hosting packages are as follows:

[Image: SiteGround shared hosting packages]

 

Besides Shared hosting - SiteGround offers: Cloud Hosting, WordPress Hosting and WooCommerce Hosting.

# 3



Eleven2 is a US-based web hosting service with a global presence. It was founded in 2003 and is headquartered in Los Angeles. Eleven2 offers two types of shared hosting: shared hosting with standard storage and shared hosting with SSD storage. Eleven2's shared hosting packages are as follows:

[Image: Eleven2 shared hosting packages]
[Image: Eleven2 SSD shared hosting packages]

Eleven2 offers other services like: Reseller Hosting, SSD Reseller Hosting, Virtual Private Servers and Cloud Hosting.

# 4



Bluehost is a web hosting company owned by Endurance International Group. It is one of the 20 largest web hosts, collectively hosting well over 2 million domains. If you are looking for a good web hosting company, you have probably heard of Bluehost; it is a popular and quickly growing host. Bluehost's shared hosting packages are as follows:

[Image: Bluehost shared hosting packages]

Bluehost offers other services like: Cloud Hosting, WordPress Hosting and Virtual Private Servers.

# 5



iPage is a popular web hosting platform that hosts a very large number of websites. It is one of those inexpensive hosting providers that deliver quality services with great support. iPage's shared hosting packages are as follows:

[Image: iPage shared hosting packages]

iPage offers other services like: Dedicated Hosting & VPS Hosting.

# 6



JustHost is a straightforward hosting provider whose plans come packed with features and are backed by extremely good support. JustHost's shared hosting packages are as follows:

[Image: JustHost shared hosting packages]

JustHost doesn't offer any other services, but it does include useful marketing tools (Google Ads credit) to help you get off to a good start.

# 7



7upHost is a global web hosting provider operating from Bangladesh. Established in 2008, it has been serving clients from all over the world, offering rock-solid hosting with friendly support. 7upHost's shared hosting packages are as follows:

[Image: 7upHost shared hosting packages]

7upHost offers all kinds of web hosting besides shared hosting: SSD Hosting, Reseller Hosting, Virtual Private Server, Dedicated Server, Managed AWS Hosting, Google Cloud, GSuite and Professional Email Hosting services.

# 8



A2 Hosting is a much talked-about web hosting provider these days thanks to its diversified hosting services. Their shared hosting packages are as follows:

[Image: A2 Hosting shared hosting packages]

A2 Hosting offers a variety of services like: WordPress Hosting, Reseller Hosting, Virtual Private Server, Dedicated Server.

# 9



DreamHost is one of those veteran web hosting providers still popular across the globe. Their shared hosting packages are as follows:

[Image: DreamHost shared hosting packages]

Other Services by DreamHost: Cloud Hosting, Virtual Private Server, Dedicated Hosting.

# 10



GreenGeeks is well known as an environment-friendly hosting provider. Their datacenters are powered by renewable energy sources, and they are also reputed for their excellent hosting services and support. Their shared hosting packages are as follows:

[Image: GreenGeeks shared hosting packages]

GreenGeeks offers a variety of web hosting solutions, including shared hosting, reseller hosting, virtual private server hosting and dedicated hosting.

Whitelist IP or IP range in/out using iptables

# Flush existing rules
iptables -F

# Allow existing connections to continue (assuming eth0 is the Ethernet port);
# adding this before the DROP policy keeps the current SSH session alive
iptables -A INPUT -i eth0 -m state --state ESTABLISHED,RELATED -j ACCEPT

# Accept everything from the 192.168.0.x network
iptables -A INPUT -i eth0 -s 192.168.0.0/24 -j ACCEPT

# Set the default policy for the INPUT chain to DROP
iptables -P INPUT DROP

# Allow connections from this host to 192.168.1.10
iptables -A OUTPUT -o eth0 -d 192.168.1.10 -j ACCEPT
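To verify the result and keep the rules after a reboot, something like the following can be used; the save step assumes a CentOS/RHEL-style system with the iptables service, and the file path differs on other distributions:

# List the active INPUT rules with packet counters to confirm the whitelist works
iptables -L INPUT -n -v --line-numbers

# Persist the rules across reboots (CentOS/RHEL style)
service iptables save
# or equivalently:
iptables-save > /etc/sysconfig/iptables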

GMail Phishing scam – Be careful!!

If you are a tech-savvy person, you probably know how phishing works. This time, however, attackers used an optical illusion to trap people in a new phishing scam targeting Gmail users. Here is what it looks like:

[Image: Gmail phishing scam email]

The scammers attach an image that looks like a real attachment. When you click on it, it takes you to the scammers' destination, the phishing page. And if you ignore the login URL, you are DONE!!

The phishing login page has also been crafted to look like it comes from Google, so even the savviest person may fall for the trick. The give-away is the address bar: a genuine Google login page starts with https://accounts.google.com/, while this fake page's address starts with data:text/html. Here is how the phishing page URL looks:

[Image: Gmail phishing page data URL]

Img credit: The Hacker News

 

So, the next time you are forwarded to any login page, double- and triple-check the address.

You may find more about it in this link.

Restrict Search Engines from finding your website

While everybody is busy getting their websites more and more exposed to search engines (SEO), you might instead want to restrict search engines from crawling your website.

You may have your own development website, or your company's web-based HR system, which you don't want to expose to the world. Search engines will find your website if you do not restrict them. If you want to restrict the complete website, just upload a file named “robots.txt” to the root folder of your site and add the following lines to it:

User-agent: *
Disallow: /

If you want to restrict a specific part of the site (files in a specific directory), add the following lines:

User-agent: *
Disallow: /yourdirectory

If you want a certain search engine to be restricted (for example Google), add the following lines:

User-agent: googlebot
Disallow: /yourdirectory

Some of the other search engine bots are as follows:

MSN/Bing: bingbot
Yahoo: Slurp (Yahoo search is currently powered by Bing, which uses bingbot)
Baidu: Baiduspider
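
As an illustration, several of the rules above can be combined in a single robots.txt file; the “/private/” directory used here is only a placeholder:

# Keep Googlebot and bingbot out of /private/
User-agent: Googlebot
Disallow: /private/

User-agent: bingbot
Disallow: /private/

# Every other crawler may access the whole site
User-agent: *
Disallow: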