As more and more enterprises migrate their workloads to the public cloud, it is essential to understand the fundamental security challenges they can face along the way, especially when they choose Amazon Web Services as their platform. Let me illustrate with a real-life example.
A company named Code Spaces suffered a major cyber attack on its AWS account in 2014. Attackers compromised the AWS control panel and demanded a ransom. When Code Spaces refused, the attackers began deleting resources: S3 bucket data, AMIs, EBS snapshots and instances. They even erased the backups, since those were managed through the same control panel. With nothing left to recover, the company was forced to shut down.
Many organizations are adopting a hybrid cloud model, running workloads across the public cloud and on-premises infrastructure, which makes monitoring both environments a challenge. According to a cybersecurity industry report, the total global damage caused by cyber attacks will reach $6 trillion annually by the end of 2021, up from an estimated $3 trillion per year in 2015. Even a few hours of downtime can cause massive losses: in August 2016, Delta Airlines lost tens of millions of dollars due to roughly six hours of application downtime. Code Spaces, of course, suffered the worst-case scenario.
Listed below are a few tips that, once implemented, will make your AWS cloud significantly more secure and reliable. Let's explore:
Carefully Decide IAM Roles:
Identity and Access Management (IAM) roles are used to set permissions at multiple levels across your AWS resources. For example, if you launch an Elastic Compute Cloud (EC2) instance, you can attach an IAM role to it so the instance receives temporary credentials for API requests, rather than embedding long-lived access keys. IAM is one of the best services AWS offers for protecting your infrastructure: if a role is compromised, its temporary credentials expire on their own, so you don't have to revoke your account-wide credentials.
There are a few things to remember before assigning IAM roles to anyone:
- In IAM, instead of attaching broad permissions to individual users, attach policies to groups and add users to those groups. This keeps permissions consistent and eliminates unnecessary access.
- Follow the principle of least privilege for every IAM role: only grant access to the resources a user's job responsibilities actually require.
- Train your employees to enable multi-factor authentication (MFA) on their accounts, to set strong passwords, and to change those passwords from time to time.
- Rotate IAM access keys on a regular basis and make sure no one keeps using stale credentials; a sketch of how to spot old keys is shown after this list.
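As an illustration, here is a minimal sketch, using Python and boto3, of how you might scan an account for access keys that are overdue for rotation. The 90-day threshold is an assumption; pick whatever rotation window your policy requires.

```python
import boto3
from datetime import datetime, timezone

# Assumed threshold: flag access keys older than 90 days for rotation.
MAX_KEY_AGE_DAYS = 90

iam = boto3.client("iam")

def find_stale_access_keys():
    """Return (user, key_id, age_in_days) for every active key past the threshold."""
    stale = []
    for page in iam.get_paginator("list_users").paginate():
        for user in page["Users"]:
            keys = iam.list_access_keys(UserName=user["UserName"])
            for key in keys["AccessKeyMetadata"]:
                age = (datetime.now(timezone.utc) - key["CreateDate"]).days
                if key["Status"] == "Active" and age > MAX_KEY_AGE_DAYS:
                    stale.append((user["UserName"], key["AccessKeyId"], age))
    return stale

if __name__ == "__main__":
    for user_name, key_id, age in find_stale_access_keys():
        print(f"{user_name}: key {key_id} is {age} days old -- rotate it")
```

A report like this can be run on a schedule so stale keys are surfaced before they become a liability.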
Define Virtual Private Cloud (VPC):
An AWS Virtual Private Cloud (VPC) lets you launch your AWS resources inside a virtual network that you define, much like operating in your own data center. You can choose the VPC's IP address range, create subnets and route tables, and add security groups that only allow the protocols you need (SFTP, HTTPS and so on). Isolating your network behind these boundaries reduces the attack surface.
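As a rough sketch of what this looks like in practice, the boto3 snippet below creates a VPC, a subnet and a security group that only permits inbound HTTPS. The region, CIDR ranges and group name are example values, not recommendations.

```python
import boto3

# Assumed region and example CIDR ranges -- adjust to your own layout.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Create the VPC with a private IP range of your choosing.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# Carve out a subnet for your instances.
subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")

# Security group that only allows inbound HTTPS (port 443).
sg = ec2.create_security_group(
    GroupName="web-https-only",          # hypothetical name
    Description="Allow inbound HTTPS only",
    VpcId=vpc_id,
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)
```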
Monitor VPC Flow Logs:
The VPC Flow Logs feature lets you record all incoming and outgoing traffic for the network interfaces in your VPC, and the recorded data can be published to CloudWatch Logs or Simple Storage Service (S3). Each flow log record captures the details of traffic between a source and a destination: port numbers, sender and receiver addresses, bytes transferred, start and end times, whether the traffic was accepted or rejected, and the number of packets. This tells you which traffic is failing to reach your instances and helps you diagnose the network and adjust your security rules.
You can also monitor flow logs for suspicious traffic and indicators of compromise so you can respond to an incident quickly.
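Here is a minimal sketch of enabling flow logs for a VPC and publishing them to CloudWatch Logs with boto3. The VPC ID, log group name and IAM role ARN are hypothetical placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Hypothetical identifiers -- substitute your own VPC, log group and IAM role.
ec2.create_flow_logs(
    ResourceType="VPC",
    ResourceIds=["vpc-0123456789abcdef0"],
    TrafficType="ALL",                      # capture both accepted and rejected traffic
    LogDestinationType="cloud-watch-logs",
    LogGroupName="vpc-flow-logs",
    DeliverLogsPermissionRoleArn="arn:aws:iam::111122223333:role/flow-logs-role",
)
```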
Limit Your Database Access and Follow Best Security Practices for Storage:
To really understand database security, you must know who is accessing the database and why, and you must think beyond regular employee access. For example, if your application pulls in data from an outside source, you need additional controls such as integrity validation, encryption and confidentiality checks before that data is stored in your database. AWS offers a variety of database and storage services: Relational Database Service (RDS), DynamoDB, Elastic Block Store (EBS), Aurora (a MySQL-compatible database), Simple Storage Service (S3), Redshift (a petabyte-scale data warehouse) and ElastiCache for in-memory caching. For AWS database and storage security you can adopt the following set of rules:
- Do not make your S3 buckets publicly readable or writable (see the sketch after this list).
- Regularly audit the access logs recorded for your databases.
- Encrypt all the data stored in RDS and EBS.
- Restrict access to created instances.
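To make the first and third rules concrete, the following boto3 sketch blocks all public access on a bucket and turns on default server-side encryption. The bucket name is a hypothetical example.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-bucket"   # hypothetical bucket name

# Block every form of public access for this bucket.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Turn on default server-side encryption so new objects are encrypted at rest.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)
```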
Customize Security with Third Party or Native Tools:
Your IT team can leverage a variety of tools and security audits from AWS or from third-party providers to address the main security concerns. AWS-provided tools include Artifact, Certificate Manager, Cloud Directory, Cognito, CloudHSM, Firewall Manager, GuardDuty, Macie, Inspector, Shield and WAF. You can pick whichever of these fits your requirements to keep your cloud applications and data safe, and you can define your own rules and policies on top of them. If you want to become an expert in these services, you can go through the AWS Technical Essentials training, which is always accessible online. The AWS Marketplace also offers a wide range of third-party security tools.
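Many of these services can be switched on with a single API call. As an example, this hedged sketch enables GuardDuty threat detection in one region with boto3; the region and publishing frequency are assumptions.

```python
import boto3

# A minimal sketch: turn on GuardDuty threat detection in one region.
guardduty = boto3.client("guardduty", region_name="us-east-1")

detector = guardduty.create_detector(
    Enable=True,
    FindingPublishingFrequency="FIFTEEN_MINUTES",
)
print("GuardDuty detector created:", detector["DetectorId"])
```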
DevOps Plan to Enhance Security:
Differences of opinion between DevOps and security teams often dominate an IT organization's strategy. Bringing them together as DevSecOps gives you a single team that manages testing, automation and application releases efficiently. Automating security and aligning it with the DevOps pipeline delivers results faster: a push to version control automatically triggers a build as the first phase of the pipeline, and problems found in unit and integration testing are dealt with as they emerge, because DevSecOps verifies each phase without slowing down deployment. With IAM and the security automation available from free tools, it is possible to secure the whole pipeline and give your IT team full security check-ups that uncover vulnerabilities in the environment.
AWS services such as Inspector, CloudTrail, CloudWatch and Lambda let you automate and assess security in your cloud environment. Used together, they can help prevent attacks of all sizes on your network.
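One common automation pattern is a small Lambda function that reacts to security findings. Below is a hedged sketch of a handler assumed to be triggered by GuardDuty findings routed through an EventBridge rule; the SNS topic ARN and the severity threshold are hypothetical.

```python
import json
import boto3

sns = boto3.client("sns")

# Hypothetical SNS topic used to alert the security team.
ALERT_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:security-alerts"

def handler(event, context):
    """Lambda entry point, assumed to receive GuardDuty findings
    forwarded through an EventBridge rule."""
    finding = event.get("detail", {})
    severity = finding.get("severity", 0)

    # Only escalate medium/high severity findings (threshold is an assumption).
    if severity >= 4:
        sns.publish(
            TopicArn=ALERT_TOPIC_ARN,
            Subject=f"GuardDuty finding: {finding.get('type', 'unknown')}",
            Message=json.dumps(finding, indent=2, default=str),
        )
    return {"forwarded": severity >= 4}
```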
Encrypting Relational Database:
Whenever you deploy a database on AWS RDS, don't forget to tick the enable-encryption checkbox: it adds another layer of security to your RDS data, and the best part is that it requires no extra customization.
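If you provision RDS programmatically rather than through the console, the same option is a single flag. This sketch uses example instance settings; the key point is `StorageEncrypted=True`, and the hard-coded password is a placeholder only.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Hypothetical instance settings; the key point is StorageEncrypted=True.
rds.create_db_instance(
    DBInstanceIdentifier="app-db",
    DBInstanceClass="db.t3.micro",
    Engine="mysql",
    AllocatedStorage=20,
    MasterUsername="admin",
    MasterUserPassword="use-a-secrets-manager-instead",  # placeholder only
    StorageEncrypted=True,   # encrypt data at rest with the default KMS key
)
```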
Use Load Balancers:
Whenever your application handles heavy workloads, it is a good idea to use Elastic Load Balancing. It works hand in hand with Auto Scaling, can terminate TLS so traffic stays encrypted in transit, can store access logs, and can sit behind AWS WAF as a web application firewall.
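As an example of the logging piece, this sketch turns on access logs for an Application Load Balancer with boto3. The load balancer ARN and S3 bucket name are hypothetical placeholders.

```python
import boto3

elbv2 = boto3.client("elbv2", region_name="us-east-1")

# Hypothetical ARN and bucket -- substitute your own load balancer and S3 bucket.
LB_ARN = "arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/my-alb/abc123"

elbv2.modify_load_balancer_attributes(
    LoadBalancerArn=LB_ARN,
    Attributes=[
        {"Key": "access_logs.s3.enabled", "Value": "true"},
        {"Key": "access_logs.s3.bucket", "Value": "my-alb-access-logs"},
    ],
)
```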
Protect your Access keys and Passwords:
Passwords and access keys are the two primary credentials used to authenticate to your account, and both apply to the root user and to individual IAM users. It is your responsibility to protect them. Never embed credentials (including .pem key files) in a publicly accessible code repository, and rotate all security credentials regularly. Protecting accounts with multi-factor authentication is a best practice as well; you can even require an MFA token for API access, so stolen access keys alone are not enough to do damage. If you have any hint or suspicion that your credentials have been stolen, change or delete them immediately.
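When you suspect a key has leaked, a quick first step is to deactivate it rather than delete it, so you can investigate before removing it for good. Here is a minimal boto3 sketch; the user name and key ID are hypothetical.

```python
import boto3

iam = boto3.client("iam")

def quarantine_access_key(user_name: str, access_key_id: str) -> None:
    """Deactivate (not yet delete) an access key suspected of being stolen,
    so you can investigate before removing it permanently."""
    iam.update_access_key(
        UserName=user_name,
        AccessKeyId=access_key_id,
        Status="Inactive",
    )
    print(f"Access key {access_key_id} for {user_name} is now inactive")

# Hypothetical values -- replace with the user and key you need to lock down.
quarantine_access_key("deploy-bot", "AKIAEXAMPLEKEY123456")
```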
Final Words:
Security is the first step of a successful business journey. Adopting these strategies and implementing them well in your cloud environment will boost your business performance and shield you from the cyber storm.