Governments around the world are making bold plans for cyber resilience, with many setting 2030 as the target date for an unprecedented level of infrastructure and asset protection.
The UK’s Government Cybersecurity Strategy, for example, notes that the country’s viability as a cyber power depends on cyber resilience and sets a goal of “the whole public sector being resilient to known vulnerabilities and attack methods no later than 2030.” Australia’s strategy calls for a whole-of-nation approach to making the country “a world leader in cyber security by 2030.” Other countries are following suit, with support from efforts such as the Cybersecurity Futures 2030 initiative.
In the United States, the most recent National Cybersecurity Strategy set goals only for 2024-2025, but the Cybersecurity and Infrastructure Security Agency (CISA) Secure by Design initiative, launched in April 2023, promotes many of the same cybersecurity best practices being pursued by other countries, the UK and Australia among them.
With 2030 only five years away, cybersecurity efforts, whether national, international or organization-specific, still face many of the same barriers that have always hindered comprehensive security. So the question remains: Will they hit their 2030 cyber resilience targets?
Resistance to Resilience: The Barriers in the Way
The barriers to effective cybersecurity include familiar suspects such as budgetary and resource limitations, the growing complexity of modern systems and the challenges of keeping up with rapidly evolving cyber threats. But at the top of the list for many organizations is the shortage of cybersecurity skills among workforces who are already understaffed and overburdened.
The most recent Cybersecurity Workforce Study from ISC2 found that while the global cybersecurity workforce swelled to 5.5 million workers in 2023, a 9% increase in a single year, the gap between supply and demand grew even faster, rising 13% over the same period. And it’s more than just a numbers gap: the study found the skills gap to be an even greater concern, with respondents saying a lack of necessary skills did more to make their organizations vulnerable than a simple shortage of people.
The current approach is flawed, especially considering how we’ve seen it play out in private enterprise. The grand plans that governments have for cybersecurity will require significant uplifts to security programs, including dramatic improvements in developer upskilling, skills verification and guardrails for artificial intelligence tools that organizations have failed so far to effectively implement.
Organizations need to modernize their approach, implementing upskilling pathways that use deep data insights to provide the strongest skills verification possible. They also need to manage and mitigate the inherent risk that developers with low security maturity bring to the table.
Developer Risk Management at the Heart of a Secure Future
A recurring element in these 2030 cybersecurity plans is the importance of ensuring that organizations and people can trust digital products and software.
If governments want their plans to succeed, they need to set an example for industry and public-sector organizations to follow. Developer education requires an investment, but the payoff is significant.
Developers with the skills to write secure code, and to correct insecure code generated by AI assistants or supplied by third parties, have been shown to substantially reduce the number of software vulnerabilities. Stopping vulnerabilities at the start of development also saves time and money: fixes can take 15 times longer at the testing stage and up to 100 times longer once a program is deployed. Ultimately, the secure practices advocated by CISA’s Secure by Design and similar initiatives increase developer productivity, improve the SDLC workflow and spur innovation, all while reducing risk.
An effective program would provide ongoing, hands-on education in real-world scenarios, delivered in a way that accommodates developers’ work schedules. It would establish the baseline skills developers need, use internal and industry benchmarks to measure progress, and identify areas that need improvement. It would also be designed to evolve alongside the threat landscape. Finally, organizations must be able to prove that the upskilling program has succeeded by effectively measuring outcomes.
Managing developer risk via upskilling and education isn’t the only step organizations — or nations — need to take, but it is a key foundation for creating a robust culture of security.
Conclusion
For security leaders worldwide to keep pace with both emerging technology and emerging threats, they must finally overcome the barriers that have traditionally hampered their success, and the skills shortage is perhaps the biggest of those barriers. Developers skilled in secure practices, working in tandem with security teams rather than in isolation, can bring security into the earliest stages of the SDLC, where fixes are easiest to make.
But that won’t happen if developers lack the critical cybersecurity skills and knowledge they need. Closing that skills gap through upskilling and targeted developer training will go a long way toward helping governments and other organizations around the world meet their ambitious 2030 cyber resilience goals, ensuring a brighter and safer future for us all.