Securing Websites

Below is a brief overview of steps you can take to help secure your website. Security Audit Systems offer full website penetration testing services to help secure the websites you operate.

The Basics

Step 1)

Know your web application/website – One of the most fundamental steps in building a secure website is to understand the site files, the backend system, and any other files or processes that run your website. Understanding what these files and processes do allows you to fix problems quickly, remove plugins that aren’t required, and verify that permissions are set correctly.
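
A simple inventory of your web root can be scripted. The sketch below (Python, using a hypothetical audit_site_files helper) walks a directory and flags world-writable files, one easy-to-check signal that permissions need attention:

```python
import os
import stat

def audit_site_files(root):
    """Walk a site directory and report files that are world-writable,
    a common sign of overly loose permissions."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if mode & stat.S_IWOTH:  # writable by "other" users
                findings.append(path)
    return findings
```

Run it against your document root and review anything it reports before tightening the permissions.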

Step 2)

Server updates – Ensure that the operating system your website sits on is fully updated. Running only the core services required to deliver the website is a critical step in securing it; leaving extra services running on a server gives potential attackers additional vulnerabilities to probe when they try to gain access.
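
One quick way to spot extra services is to check which well-known ports are accepting connections. The Python sketch below is an illustrative local check, not a full audit; the port list and function name are my own choices:

```python
import socket

# A small sample of well-known service ports to probe.
COMMON_PORTS = {21: "ftp", 22: "ssh", 23: "telnet", 25: "smtp",
                80: "http", 110: "pop3", 443: "https", 3306: "mysql"}

def open_ports(host="127.0.0.1", ports=COMMON_PORTS, timeout=0.3):
    """Return the subset of `ports` that accept a TCP connection,
    i.e. services currently listening on the host."""
    found = {}
    for port, name in ports.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means connected
                found[port] = name
    return found
```

Anything it reports that is not needed to serve your website is a candidate for disabling.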

Step 3)

Backend and services updates – Not only should you secure the server platform, it’s vital to secure the backend systems that make your web application function. This could be WordPress, Drupal, Joomla or another popular CMS. If these backends are not secured and updated regularly (including plugins), you will be left vulnerable to attack. Do not forget any additional applications you may have running on the website (forums, mail forms, news feeds, etc.); these are all common targets.
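
Keeping track of what needs updating is easier with a simple version comparison. The sketch below assumes you maintain two small maps, installed versions and latest known releases (the component names here are purely illustrative):

```python
def parse_version(v):
    """Turn a dotted version string like '6.4.1' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def outdated_components(installed, latest):
    """Compare installed component versions against the latest known
    releases and return the names that need updating."""
    return [name for name, ver in installed.items()
            if name in latest and parse_version(ver) < parse_version(latest[name])]

# Example with made-up version numbers:
# outdated_components({"wordpress": "6.3", "akismet": "5.3"},
#                     {"wordpress": "6.4.1", "akismet": "5.3"})
# flags "wordpress" as outdated.
```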

Extra Security

Step 4)

Defaults – Ensure that you run everything as a ‘minimal install’, with only the functions you need enabled. Many CMS platforms come with lots of extra functionality that you may never use; simply disable or remove it. Look at changing defaults such as admin usernames, database table prefixes, and port numbers for services like Secure FTP/SSH. Disable root login/admin access by default and use super user accounts instead.
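
The default values worth changing can be checked mechanically. This is a minimal sketch, assuming your settings are available as a simple dictionary; the key names and default values shown are illustrative:

```python
# Well-known defaults that attackers will try first (illustrative set).
RISKY_DEFAULTS = {
    "admin_user": "admin",       # default admin username
    "table_prefix": "wp_",       # default WordPress table prefix
    "ssh_port": 22,              # default SSH port
    "root_login_enabled": True,  # direct root/admin login allowed
}

def flag_defaults(config):
    """Return the config keys still set to well-known default values."""
    return [key for key, default in RISKY_DEFAULTS.items()
            if config.get(key) == default]
```

Anything the check returns is a setting you should consider changing away from its default.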

Step 5)

Securing website permissions – Make sure you apply strong restrictions to directories, more aggressive than the defaults. You can install security plugins that will adjust directory and file permissions for you; if you are unsure what to change, do a bit of research first. Make sure your robots.txt file lists only the directories you want Google to crawl, so that other ‘hidden’ directories do not appear in a Google search. Remember that robots.txt is publicly readable, and paths listed in it can be picked up by Google Hacking Database attack techniques.
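
A common baseline for web content is 755 on directories and 644 on files. The sketch below applies that baseline recursively; treat it as a starting point, since some applications need different modes for specific paths:

```python
import os

def harden_permissions(root, dir_mode=0o755, file_mode=0o644):
    """Recursively apply conservative permissions: 755 for directories
    (owner rwx, others rx) and 644 for files (owner rw, others r)."""
    for dirpath, _dirnames, filenames in os.walk(root):
        os.chmod(dirpath, dir_mode)
        for name in filenames:
            os.chmod(os.path.join(dirpath, name), file_mode)
```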

Step 6)

Old scripts / monitoring – If a plugin you use becomes unsupported or is no longer updated, remove it from your system and look for an alternative solution. Monitor your site files regularly so that abandoned scripts and unexpected changes are spotted early.
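
Plugins that have not changed on disk for a long time are worth reviewing. The sketch below uses file modification times as a rough staleness signal; the directory layout (one subdirectory per plugin) and the one-year threshold are assumptions you should adjust:

```python
import os
import time

def stale_plugins(plugin_dir, max_age_days=365):
    """Return plugin subdirectories whose newest file has not changed
    in `max_age_days` days, a rough signal that a plugin may be
    abandoned and worth reviewing or removing."""
    cutoff = time.time() - max_age_days * 86400
    stale = []
    for name in sorted(os.listdir(plugin_dir)):
        path = os.path.join(plugin_dir, name)
        if not os.path.isdir(path):
            continue
        newest = max(
            (os.path.getmtime(os.path.join(dp, f))
             for dp, _dn, fns in os.walk(path) for f in fns),
            default=os.path.getmtime(path),
        )
        if newest < cutoff:
            stale.append(name)
    return stale
```

A stale modification time is only a hint; always confirm against the plugin author's own release notes before removing anything.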