Insights
Around the world, the security risks posed by Advanced Persistent Threats (APTs) and complex state-sponsored attacks are making headlines, with dire warnings echoed by everyone from security vendors to world leaders. Amidst the rush to defend networks against these targeted attacks, executed by skilled and organised operators, I find that the fundamental question – “have I covered all the bases to protect my company’s network from cyberattacks?” – is becoming all the more relevant.
It’s worth noting that most attacks are not carried out by highly skilled hackers. While complex, targeted attacks pose a greater risk than ever, the most significant threats for most organizations are still the old favourites. For example, SQL Injection, a technique first identified in 1998, has remained at the top of the Open Web Application Security Project’s (OWASP) Top 10 vulnerabilities for years. It uses maliciously crafted input to inject database queries and extract information, and it is the likely cause of many major breaches in recent years, including the TalkTalk breach in 2015.
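To make the point concrete, here is a minimal sketch in Python using the standard-library sqlite3 module as a stand-in for whatever database an application actually uses; the table, data, and function names are purely illustrative. It shows how a query built by string concatenation lets attacker input escape into the SQL itself, and how a parameterised query keeps that input as plain data.

```python
import sqlite3

# In-memory database purely for illustration; real attacks target production data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_vulnerable(username: str) -> list:
    # DANGEROUS: user input is concatenated straight into the SQL statement.
    query = f"SELECT * FROM users WHERE username = '{username}'"
    return conn.execute(query).fetchall()

def login_safe(username: str) -> list:
    # SAFE: a parameterised query keeps user input as data, never as SQL.
    return conn.execute(
        "SELECT * FROM users WHERE username = ?", (username,)
    ).fetchall()

# A classic injection payload: the quote breaks out of the string literal
# and the OR clause makes the WHERE condition always true.
payload = "' OR '1'='1"
print(login_vulnerable(payload))  # returns every row - data leaked
print(login_safe(payload))        # returns nothing - input treated as a literal
```

The fix has been well understood for decades, which is exactly why injection remaining near the top of the OWASP list says more about basic hygiene than about attacker sophistication.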
It’s not just external vulnerabilities that should be examined – the insider threat is as much of a risk as ever. While malicious insiders leap to mind first, a greater threat is a careless system administrator or support team member trying to be helpful – for example, opening firewalls or resetting credentials without verifying identity. Before deciding to spend security budgets on the latest tools or platforms to build elaborate defences against the most advanced threats, you should make sure the basics are covered first – lock the back door.
This is often easier said than done. Public cloud infrastructure makes it far more complicated and confusing for security professionals. The ease with which systems grow, and the relative difficulty of tracking assets, mean that maintaining an understanding of what has been deployed – and therefore what needs to be secured and defended – is an ongoing job. Practices such as dynamic asset discovery and alerting are now essential to understanding the shape of your infrastructure. They can highlight changes in the system, especially those that haven’t been authorised or don’t conform to baselines and standards. Automated baseline and vulnerability testing can then help ensure that what has been deployed isn’t leaving an easy path into the critical parts of the system.
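As a rough illustration of what dynamic asset discovery and baseline checking can look like, here is a short Python sketch assuming an AWS estate and the boto3 SDK; the required tags and approved instance types are hypothetical placeholders for whatever your own standards define.

```python
import boto3

# Hypothetical baseline: the tags every instance is expected to carry and the
# instance types approved for this environment. Adjust to your own standards.
REQUIRED_TAGS = {"Owner", "Environment", "Patch-Group"}
APPROVED_TYPES = {"t3.micro", "t3.small", "m5.large"}

def discover_and_flag(region: str = "eu-west-1") -> list[dict]:
    """Enumerate running EC2 instances and flag any that drift from the baseline."""
    ec2 = boto3.client("ec2", region_name=region)
    findings = []
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                problems = []
                missing = REQUIRED_TAGS - tags
                if missing:
                    problems.append(f"missing tags: {sorted(missing)}")
                if instance["InstanceType"] not in APPROVED_TYPES:
                    problems.append(f"unapproved type: {instance['InstanceType']}")
                if problems:
                    findings.append({"id": instance["InstanceId"], "problems": problems})
    return findings

if __name__ == "__main__":
    for finding in discover_and_flag():
        # In practice this would feed an alerting channel rather than stdout.
        print(finding["id"], "->", "; ".join(finding["problems"]))
```

Run on a schedule and wired into alerting, even a simple check like this surfaces unauthorised or non-conforming deployments before they become the easy path in.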
A strong DevOps team can help increase overall security. By defining fully automated infrastructure as code, the team can create reliable, auditable, and repeatable secure environments. With this in place, processes such as deploying new infrastructure and patching can be fully automated through code releases, reducing the risk of manual error. Quick deployment of immutable infrastructure can also create a more secure network by redeploying on a regular basis. This ensures servers do not drift from the “gold standard” build and that unintended changes or installations do not persist.
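The sketch below shows the redeploy-rather-than-patch idea in its simplest form, again assuming AWS and boto3; the AMI ID, tag, and instance type are placeholders, and a real pipeline would add health checks and load-balancer draining rather than running a raw script.

```python
import boto3

# Hypothetical identifiers - substitute your own golden AMI, tag, and instance type.
GOLDEN_AMI_ID = "ami-0123456789abcdef0"
INSTANCE_TYPE = "t3.micro"
ROLE_TAG = {"Key": "Role", "Value": "web"}

def redeploy(region: str = "eu-west-1") -> None:
    """Replace running 'web' instances with fresh ones built from the golden image."""
    ec2 = boto3.client("ec2", region_name=region)

    # Find the instances currently serving the role.
    old = ec2.describe_instances(
        Filters=[
            {"Name": f"tag:{ROLE_TAG['Key']}", "Values": [ROLE_TAG["Value"]]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    old_ids = [
        i["InstanceId"]
        for r in old["Reservations"]
        for i in r["Instances"]
    ]

    # Launch replacements from the known-good image; no in-place patching, no drift.
    count = max(len(old_ids), 1)
    ec2.run_instances(
        ImageId=GOLDEN_AMI_ID,
        InstanceType=INSTANCE_TYPE,
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{"ResourceType": "instance", "Tags": [ROLE_TAG]}],
    )

    # Retire the old generation once the new one is in service
    # (health checks and connection draining are omitted for brevity).
    if old_ids:
        ec2.terminate_instances(InstanceIds=old_ids)

if __name__ == "__main__":
    redeploy()
```

Because every server is rebuilt from the same known-good image, any manual change made to a running box is wiped out at the next redeploy – exactly the property that keeps drift and stray installations from persisting.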
While increasingly complex and targeted system breaches are making headlines, many data loss events could be prevented by ensuring that the easiest and most commonly attacked vectors are secured.
Scott Shearer
Head of Cyber Security
Taskize