Planning for post-quantum cryptography: Impact, challenges and next steps

Symmetric vs. asymmetric cryptography

Encryption algorithms can be classified into one of two categories based on their use of encryption keys. Symmetric encryption algorithms use the same secret key for both encryption and decryption. Asymmetric or public-key encryption algorithms use a pair of related keys. Public keys are used for encryption and digital signature validation, while private keys are used for decryption and digital signature generation.

Different types of encryption algorithms have different benefits and downsides. For example, symmetric encryption algorithms are often more efficient, making them well-suited to bulk data encryption. However, they require the secret key to be distributed to both sender and recipient over a secure channel before messages can be encrypted or decrypted.

Asymmetric cryptography is less efficient but does not have this requirement. Encryption is performed using public keys, which, as their name suggests, are designed to be public. As a result, asymmetric algorithms are often used to create a secret channel over which a shared symmetric key is established for bulk data encryption.
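This handshake pattern can be sketched with a toy Diffie-Hellman exchange, one of the classic asymmetric techniques for establishing a shared symmetric key over a public channel. The prime below is far too small for real use and is purely illustrative; deployed systems use standardized groups of 2048 bits or more.

```python
import hashlib
import secrets

# Toy parameters: a small Mersenne prime and generator, for illustration only.
P = 2**127 - 1
G = 3

def keypair():
    private = secrets.randbelow(P - 2) + 2   # secret exponent, kept private
    public = pow(G, private, P)              # "easy" modular exponentiation
    return private, public

# Each party publishes only the public half of its key pair.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Both sides combine their own private key with the other's public key and
# arrive at the same value; an eavesdropper who saw only a_pub and b_pub
# would have to solve the discrete logarithm problem to recover it.
a_shared = pow(b_pub, a_priv, P)
b_shared = pow(a_pub, b_priv, P)
assert a_shared == b_shared

# Hash the shared secret down to a 256-bit key for bulk symmetric encryption.
symmetric_key = hashlib.sha256(a_shared.to_bytes(16, "big")).digest()
print(len(symmetric_key))  # 32
```

The symmetric key derived at the end is what a protocol like TLS would then hand to a fast symmetric cipher for bulk data encryption.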

Asymmetric cryptography and “hard” problems

Asymmetric encryption algorithms are built using a mathematically “hard” problem. This is a mathematical function where performing an operation is far easier than undoing it. For example, a commonly used “hard” problem in asymmetric cryptography is the factoring problem. Multiplying two large prime numbers together is relatively “easy,” taking only polynomial time. In contrast, factoring the result of this multiplication is “hard”: no known classical algorithm does it in polynomial time.

This difference in complexity makes it possible to develop cryptographic algorithms that are both usable and secure. Public key encryption algorithms are designed so that legitimate users only perform “easy” operations, while an attacker must perform “hard” ones. The asymmetric complexity between these operations makes it possible to choose key lengths for which performing the “easy” operations is possible, while the “hard” operations are infeasible on modern computers.
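The asymmetry is easy to see in code. In this small sketch, the “easy” direction is a single multiplication, while undoing it by naive trial division already takes noticeable work for two six-digit primes and becomes hopeless at the prime sizes used in practice (real attackers use better algorithms than trial division, but the gap remains).

```python
def multiply(p, q):
    # The "easy" direction: polynomial in the number of digits.
    return p * q

def factor(n):
    # The "hard" direction, done naively: trial division by odd candidates.
    # Roughly sqrt(n) steps -- exponential in the number of digits of n.
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return None

p, q = 999_983, 1_000_003   # two small primes; real RSA primes are ~1024 bits
n = multiply(p, q)          # instantaneous
print(factor(n))            # already ~500,000 loop iterations
```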

Impacts of quantum computing on asymmetric cryptography

The security of public-key cryptography depends on the “hardness” of these underlying problems. If the “hard” problem (factoring, discrete logarithms, etc.) can be solved with polynomial complexity, then the security of the algorithm is broken. Even if breaking the cryptography is only hundreds or thousands of times more costly than using it, an attacker with sufficient resources and incentive (such as a nation-state) could perform the attack.

Quantum computing poses a threat to asymmetric cryptography due to the existence of Shor’s algorithm. On a sufficiently large quantum computer, Shor’s algorithm can solve the factoring problem in polynomial time, breaking the security of asymmetric cryptography.

 

5 Critical Considerations in Building a Zero Trust Architecture

Zero Trust is everywhere. It’s covered in industry trade publications and events, it’s a topic of conversation at board meetings, and it’s on the minds of CISOs, CIOs and even the President.

What is Zero Trust, and why is it important?

Zero Trust isn’t a cybersecurity solution in and of itself. However, implementing a Zero Trust architecture will help reduce the number of successful cyberattacks your organization might otherwise endure, greatly reducing operational and financial risk.

What is Zero Trust?

A Zero Trust security model, simply put, is the idea that anything inside or outside an organization’s networks should never implicitly be trusted. It dictates that users, their devices, the network’s components, and in fact any and every packet that holds a stated identity, should continuously be monitored and verified before anyone or anything is allowed to access the organization’s environment – especially its most critical assets.

This concept is the exact opposite of the old “trust everything if it’s in my zone” model that many IT models operated under in years past. Today, Zero Trust takes a “trust nothing unless it can be verified in multiple ways” approach to security.

How do you build a Zero Trust architecture?

If you’re considering implementing a Zero Trust model in your organization and want to better understand how to get started, John Kindervag, the creator of Zero Trust, outlines these five practical steps.

Step 1: Define your protect surfaces.

Most organizations understand the concept of the attack surface, which includes every potential point of entry a malicious actor might try to access in an attempt to compromise an organization.

Protect surfaces are different. They encompass the data, physical equipment, networks, applications and other crucial assets your organization wants to deliberately protect, given how important they are to the business.

Why take the protect surface approach instead of looking at the entire attack surface? Kindervag puts it simply: “Protect surface becomes a problem that’s solvable, versus a problem, like the attack surface, that’s actually unsolvable. How could you ever solve a problem as big as the internet itself?”

It’s essential to first identify the assets within your environment that require protection. Where does the most sensitive data reside? What operational technology is most critical to your plant and production processes? Make a list of the assets you absolutely must protect from a security and access management standpoint, and prioritize them.

Step 2: Map the transaction flows

Once you’ve identified your protect surfaces, you can start to map their transaction flows.

This includes examining all the ways in which various users have access to those assets and how each protect surface interacts with all other systems in your environment. For example, a user might be able to access terminal services only if multi-factor authentication (MFA) is implemented and verified, the user is logging on at an expected time and from the expected place and doing an expected task.
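A mapped flow like that one ultimately becomes an explicit policy check. The sketch below is hypothetical (the attribute names and rules are invented for illustration), but it shows the shape of a per-request Zero Trust decision in which every attribute must verify before access is granted.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class AccessRequest:
    user: str
    mfa_verified: bool
    location: str
    login_time: time
    task: str

# Illustrative expected pattern derived from a mapped transaction flow.
EXPECTED = {
    "location": "corp-vpn",
    "window": (time(7, 0), time(19, 0)),   # expected working hours
    "tasks": {"terminal-services"},
}

def allow(req: AccessRequest) -> bool:
    start, end = EXPECTED["window"]
    return (
        req.mfa_verified                        # MFA implemented and verified
        and req.location == EXPECTED["location"]
        and start <= req.login_time <= end      # logging on at an expected time
        and req.task in EXPECTED["tasks"]       # doing an expected task
    )

print(allow(AccessRequest("alice", True, "corp-vpn", time(9, 30), "terminal-services")))   # True
print(allow(AccessRequest("alice", False, "corp-vpn", time(9, 30), "terminal-services")))  # False: no MFA
```

In a real deployment this logic lives in an IAM or policy engine rather than application code, but the principle is the same: deny unless every attribute verifies.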

With your protect surfaces identified, prioritized and transaction flows mapped, you’re now ready to begin architecting a Zero Trust environment. Start with the highest priority protect surface and when completed, move to the next. Each protect surface with a Zero Trust architecture implemented is a high-quality step toward stronger cyber resiliency and lowered risk.

Step 3: Architect a Zero Trust environment

Keep in mind: no single product delivers a complete Zero Trust architecture. Zero Trust environments take advantage of multiple cybersecurity tools, ranging from access controls like MFA and identity and access management (IAM), to technology that protects sensitive data through processes like encryption or tokenization.

Beyond a toolbox of security technologies, every Zero Trust architecture essentially starts with creating smart, detailed segmentation and firewall policies. It takes those policies and then creates multiple variations based on attributes like the individual requesting access, the device they’re using, the type of network connection, the time of day they’re making the request and more – step by step, building a secure perimeter around each protect surface.

Step 4: Create a Zero Trust policy

This step focuses on creating the policies that govern activities and expectations related to things like access controls and firewall rules.

Think beyond posting those new policies to your organization’s intranet, too. Consider educational programs that you may need to implement throughout the organization to promote strong security practices among your employees, vendors and consultants. Frequent cyber-awareness training has moved into the mainstream, becoming a necessity that will help reduce risk.

Step 5: Monitor and maintain the network

The final step in Kindervag’s process focuses on verifying that your Zero Trust environment and the policies governing it are working the way you intended, identifying gaps or areas for improvement and course-correcting as necessary.

 

Building a risk management program

In today’s world, it’s important for every organization to have some form of vulnerability assessment and risk management program. While this can seem daunting, by focusing on some key concepts it’s possible for an organization of any size to develop a strong security posture with a firm grasp of its risk profile. In this article, we’ll discuss how to build the technical foundation for a comprehensive security program and, crucially, the tools and processes necessary to develop that foundation into a mature vulnerability assessment and risk management program.

 

Build the Foundation

It’s impossible to implement effective security, let alone manage risk, without a clear understanding of the environment. That means, essentially, taking an inventory of hosts, applications, resources, and users.

With the current computing environment, that combination is apt to include assets that reside in the cloud as well as those hosted in an organization’s own data center. Organizations have little control over the devices of remote employees, who access data on a bring-your-own-device (BYOD) basis, adding another layer of risk. There are also the software-as-a-service (SaaS) applications that the organization uses. It’s essential to know what data is kept where. With SaaS in particular, teams must have a clear understanding of who is contractually responsible for the security of the data, so as to allocate resources accordingly.

 

Manage the puzzle

Once the environment is scoped, managing it relies on three main components: visibility, control, and timely maintenance. 

Whether it is software vulnerabilities, vulnerable configurations, obsolete packages, or a range of other issues, a vulnerability scanner will show the security operations team what’s at risk and let them prioritize their reaction. That said, scanners, external or internal, are not the only option. At the high end, a penetration testing team can probe the environment to a level that vulnerability scanners can’t match. At the low end, establishing a process to monitor public vulnerability feeds and verifying whether newly exposed issues affect the environment can provide a baseline. It may not give as deep a picture as scanning or penetration testing, but the modest cost in SecOps time often makes it well worth it.
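As a minimal sketch of that baseline process, the snippet below matches entries from a vulnerability feed against a software inventory. The feed records and inventory are invented stand-ins; a real pipeline would pull from a source such as the NVD data feeds and the organization’s asset database.

```python
# Hypothetical inventory: (product, version) pairs deployed in the environment.
inventory = {
    ("openssl", "1.1.1k"),
    ("nginx", "1.18.0"),
    ("postgresql", "13.4"),
}

# Hypothetical feed entries, simplified from what a real CVE feed provides.
feed = [
    {"cve": "CVE-2025-0001", "product": "openssl", "versions": {"1.1.1k", "1.1.1l"}},
    {"cve": "CVE-2025-0002", "product": "apache", "versions": {"2.4.48"}},
]

def affected(feed, inventory):
    """Return (cve, product, version) for each feed entry present in inventory."""
    hits = []
    for entry in feed:
        for product, version in inventory:
            if product == entry["product"] and version in entry["versions"]:
                hits.append((entry["cve"], product, version))
    return hits

print(affected(feed, inventory))  # [('CVE-2025-0001', 'openssl', '1.1.1k')]
```

Even this simple comparison, run daily, turns a public feed into an actionable list for SecOps.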

Protecting the users is a major point and doesn’t always get the attention it deserves. Ultimately, that starts with user education and establishing a culture that fosters a secure environment. Users are often the threat surface that presents the greatest risk, but with proper education and attitude they can become an effective layer of a defense-in-depth strategy.

Another important step to protecting users is adding multi-factor authentication (MFA). In particular, MFA methods that require a physical or virtual token tend to be more secure than those that rely on text messaging or email. While MFA does add a minor annoyance to a user’s login, it can drastically reduce the threat posed by compromised accounts and lower the organization’s overall risk profile.
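Token-based factors are typically built on TOTP (RFC 6238): the code is computed from a shared secret and the current time, so there is no SMS or email message for an attacker to intercept or redirect. A minimal implementation using only the standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute a TOTP code (RFC 6238) from a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time() if at is None else at) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                               # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# The RFC 6238 test secret is the ASCII string "12345678901234567890".
SECRET = base64.b32encode(b"12345678901234567890").decode()
print(totp(SECRET, at=59, digits=8))  # "94287082", the published RFC 6238 test vector
```

Authenticator apps and hardware tokens implement exactly this computation; the server runs it too and compares results, so nothing secret crosses the network at login time.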

User endpoints are another area of concern. While the default endpoint protection included in the main desktop operating systems (Windows and macOS) is quite effective, it is also the defense every malware writer in the world tests against. That makes investment in an additional layer of endpoint protection worthwhile.

The last major piece here is a patch management program. This requires base processes that manage not only the patches but also the assets themselves. Fortunately, there are multiple tools available that can enhance and automate the process, and a regular patch cycle can fix vulnerabilities before exploits are even developed for them.

Ideally, the patch management process includes a change management system that’s able to smoothly accommodate emergency situations where a security hotfix must go in outside the normal window.

Pulling it all together

With the foundation laid, the final step involves communication. Simply assessing risk is not useful if there is no reliable way to organize people to act on it.

Bridging the information security teams, who are responsible for recognizing, analyzing, and mitigating threats to the organization, and the information technology teams, who are responsible for maintaining the organization’s infrastructure, is vital. Whether an organization achieves this with a process or a tool is up to them. But in either case, communication is vital, along with an ability to react across teams. This applies to non-technical teams as well — if folks are receiving phishing emails, security operations should know. 

These mechanisms need to be in place from the executive offices down to the sales or production floor, as reducing risk really is everyone’s responsibility. Moreover, the asset and patch management system needs a mechanism to prioritize patches based on business risk. Unless the IT team has the resources to deploy every single patch that comes their way, they will have to prioritize, and that prioritization needs to be based on the threat to business rather than arbitrary severity scores.

An Investment

There is no “one size fits all” solution for risk assessment and management. For example, for a restaurant that doesn’t accept reservations or orders online, a relatively insecure website doesn’t present much business risk. While it may be technically vulnerable, they are not at risk of losing valuable data.

 

Top 9 effective vulnerability management tips and tricks

The world is currently in a frenetic flux. With rising geopolitical tensions, an ever-present rise in cybercrime and continuous technological evolution, it can be difficult for security teams to maintain a straight bearing on what’s key to keeping their organization secure.

With the advent of the “Log4Shell” (Log4j) vulnerability, sound vulnerability management practices have jumped to the top of the list of skills needed to maintain an ideal state of cybersecurity. The impacts of Log4j are expected to be fully realized throughout 2022.

As of 2021, missing security updates are a top-three security concern for organizations of all sizes — approximately one in five network-level vulnerabilities are associated with unpatched software.

Not only are attacks on the rise, but their financial impacts are as well. According to Cybersecurity Ventures, costs related to cybercrime are expected to balloon 15% year over year into 2025, totaling $11 trillion.

Vulnerability management best practices

Whether you’re performing vulnerability management for the first time or revisiting your current practices to find new perspectives or process efficiencies, there are some useful strategies for reducing vulnerabilities.

Here are the top nine (We decided to just stop there!) tips and tricks for effective vulnerability management at your organization.

1. Vulnerability remediation is a long game

Extreme patience is required when it comes to vulnerability remediation. Your initial review of vulnerability counts, categories, and recommended remediations may instill a false sense of confidence: You may expect a large reduction after only a few meetings and executing a few patch activities. This is far from how reality will unfold.

Consider these factors as you begin initial vulnerability management efforts:

  • Take small steps: Incremental progress in reducing total vulnerabilities by severity should be the initial goal, not an unrealistic expectation of total elimination. The technology estate should ideally accumulate new vulnerabilities at a slightly lower pace versus what is remediated as the months and quarters roll on.
  • Patience is a virtue: Adopting a patient mindset is unequivocally necessary to avoid mental defeat, burnout and complacency. Remediation progress will be slow, but a methodical approach must be sustained.
  • Learn from challenges: Roadblocks, as they are encountered, serve as opportunities to try alternate remediation strategies. Plan around what can be solved today or in the current week.

Avoid dwelling on all the major problems preventing remediation; instead, think with a growth mindset to overcome these challenges.

2. Cross-team collaboration is required

Achieving a large vulnerability reduction requires effective collaboration across technology teams. The high vulnerability counts across the IT estate likely exist due to several cultural and operational factors within the organization that pre-exist remediation efforts, including:

  • Insufficient staff to maintain effective vulnerability management processes
  • Legacy systems that cannot be patched because they run on very expensive hardware, or provide a specific function that is cost-prohibitive to replace
  • Ineffective patching solutions that do not or cannot apply necessary updates completely (e.g., the solution can patch web browsers but not Java or Adobe)
  • Misguided beliefs that specialized classes of equipment cannot be patched or rebooted; therefore, they are not revisited for extended periods

Part of your remediation efforts should focus on addressing systemic issues that have historically prevented effective vulnerability remediation while gaining support within or across the business to begin addressing existing vulnerabilities.

Determine how the various teams in your organization can serve as a force multiplier. For example, can the IT support desk or other technical teams assist directly in applying patches or decommissioning legacy devices? Can your vendors assist in applying patches or fine-tuning the configurations of difficult-to-patch equipment?

These groups can assist in overall reduction while further plans are developed to address additional vulnerabilities.

3. Start by focusing on low-hanging fruit

Focus your initial efforts on the low-hanging fruit when building a plan to address vulnerabilities. Missing browser updates and updates to third-party software like Java or Adobe products are likely to comprise the largest initial reduction efforts.

If software like Google Chrome or Firefox is missing the previous two years of security updates, it likely signifies the software is not being used. Some confirmation may be required, but the likely response is to remove the software, not to apply patches.

To prevent a recurrence, there will likely be a need to revisit workstation and server imaging processes to determine if legacy, unapproved or unnecessary software is being installed as new devices are provisioned.

4. Leverage your end-users when needed

Don’t forget to leverage your end-users as a possible remediation vector. A single email you spend 30 minutes carefully crafting, with instructions on how they can self-update difficult-to-patch third-party applications, can save you many hours of time and effort compared to working with technical teams, where the end result may be fewer vulnerabilities remediated.

However, end-user involvement should be an infrequent and short-term approach as the underlying problems outlined in cross-team collaboration (tip #2) are addressed.

This also provides an indirect approach to increasing security awareness via end-user engagement. Users are more likely to prioritize security when they are directly involved in the process.

5. Be prepared to get your hands dirty

Many of the vulnerabilities that exist will require a manual fix, including but not limited to:

  • Unquoted service paths in program directories
  • Weak or no passwords on periphery devices like printers
  • Default or outdated SNMP community strings
  • Windows registry keys not set to required values
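The first item on that list is a good candidate for a small script. The sketch below flags unquoted service paths containing spaces; the service table is a hypothetical stand-in for data you would pull from the registry or `sc query` on a real Windows host.

```python
def unquoted_service_paths(services):
    """Return names of services whose image path has spaces but no quotes."""
    flagged = []
    for name, image_path in services.items():
        path = image_path.strip()
        # A path with spaces that is not wrapped in quotes lets Windows try
        # C:\Program.exe before C:\Program Files\..., a known
        # privilege-escalation vector.
        if " " in path and not path.startswith('"'):
            flagged.append(name)
    return flagged

# Hypothetical sample data standing in for parsed service definitions.
services = {
    "GoodSvc": r'"C:\Program Files\Vendor\svc.exe"',
    "BadSvc": r"C:\Program Files\Vendor\svc.exe",
}
print(unquoted_service_paths(services))  # ['BadSvc']
```

The fix itself is one registry edit per flagged service (wrapping the path in quotes), which is exactly the kind of hands-on work this tip describes.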

During project downtime, or when the security function is between remediation planning cycles, focus on providing direct assistance where possible. A direct intervention provides an opportunity to learn more about the business and the people operating the technology in the environment. It also provides direct value when an automated process fails to remediate or cannot remediate identified vulnerabilities.

This may also be required when already stressed IT teams cannot assist in remediation activity.

6. Targeted patch applications can be effective for specific products

Some vulnerabilities may require the application of a specific update to address large numbers of findings that automatic updates continuously fail to address. This is often seen with Microsoft security updates that did not apply completely or correctly in scattered months across several years and devices.

Search for and test the application of cumulative security updates. One targeted patch update may remediate dozens of vulnerabilities.

Once tested, use automated patch application tools like SCCM or remote monitoring and management (RMM) tools to stage and deploy the specific cumulative update.

7. Limit scan scope and schedules 

Vulnerability management seeks to identify and remediate vulnerabilities, not cause production downtime. Vulnerability scanning tools can unintentionally disrupt information systems and networks via the probing traffic generated towards organization devices or equipment.

If an organization is onboarding a new scanning tool or spinning up a new vulnerability management practice, it is best to start by scanning a small network subset that represents the asset types deployed across the network.

Over time, scanning can be rolled out to larger portions of the network as successful scanning activity on a smaller scale is consistently demonstrated.

8. Leverage analytics to focus remediation activity 

The native reporting functions provided by vulnerability scanning tools typically lack the capabilities needed to drive value-add vulnerability reduction. Consider implementing programs like Power BI, which can help the organization focus on the following:

  • New vulnerabilities by type or category
  • Net new vulnerabilities
  • Risk severity ratings for groups of or individual vulnerabilities
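Even without a BI tool, the core of that analysis is a simple set comparison over scanner exports. The scan records and category mapping below are invented for illustration:

```python
from collections import Counter

# Each finding is a (host, vulnerability) pair exported from the scanner.
last_scan = {("host1", "CVE-2024-1111"), ("host2", "CVE-2024-2222")}
this_scan = {("host1", "CVE-2024-1111"), ("host2", "CVE-2024-3333"),
             ("host3", "CVE-2024-3333")}

# Hypothetical mapping of vulnerabilities to categories.
categories = {"CVE-2024-1111": "openssl", "CVE-2024-2222": "kernel",
              "CVE-2024-3333": "browser"}

net_new = this_scan - last_scan        # findings that appeared this cycle
remediated = last_scan - this_scan     # findings fixed since the last cycle
by_category = Counter(categories[cve] for _, cve in net_new)

print(sorted(net_new))   # [('host2', 'CVE-2024-3333'), ('host3', 'CVE-2024-3333')]
print(by_category)       # Counter({'browser': 2})
```

A dashboard in Power BI would layer severity weighting and trend lines on top, but tracking these three numbers per cycle is enough to show whether remediation is outpacing new findings.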

9. Avoid overlooking compliance pitfalls or licensing issues

Ensure you fully understand any licensing requirements in relation to enterprise usage of third-party software and make plans to stay compliant.

As software evolves, its creators may look to harness new revenue streams, which have real-world impacts on vulnerability management efforts. A classic example is Java, which is highly prevalent in organizations across the globe. As of 2019, a paid license subscription is required to receive security updates for Java.

Should a third party decide to perform an onsite audit of license usage, the company may find itself tackling a lawsuit on top of managing third-party software security updates.

 

Key Steps for Public Sector Agencies To Defend Against Ransomware Attacks

Over the past two years, the pandemic has fundamentally altered the business world and the modern work environment, leaving organizations scrambling to maintain productivity and keep operational efficiency intact while securing the flow of data across different networks (home and office). While this scenario has undoubtedly created new problems for businesses in terms of keeping sensitive data and IP safe, the “WFH shift” has opened up even greater risks and threat vectors for the US public sector.

Federal, state, and local governments, education, healthcare, finance, and nonprofit organizations are all facing privacy and cybersecurity challenges the likes of which they’ve never seen before. Since March 2020, there’s been an astounding increase in the number of cyberattacks, high-profile ransomware incidents, and government security shortfalls, and many more incidents go undetected or unreported. This is in part because employees now access their computers and organization resources/applications from everywhere but the office, opening up new security threats for CISOs and IT teams.

Cyberthreats are expected to grow exponentially this year, particularly as the world faces geopolitical upheaval and international cyberwarfare. Whether it’s a smaller municipality or a local school system, no target is too small these days, and everyone is under attack due to bad actors now having more access to sophisticated automation tools.

The US public sector must be prepared to meet these new challenges and focus on shoring up vulnerable and critical technology infrastructures while implementing new cybersecurity and backup solutions that secure sensitive data.

Previous cyber protection challenges

As data volumes grow and methods of access change, safeguarding US public sector data, applications, and systems involves addressing complex and often competing considerations. Government agencies have focused on securing a perimeter around their networks; however, with a mobile workforce combined with the increase in devices, endpoints, and sophisticated threats, data is still extremely vulnerable. Hence the massive shift towards a Zero Trust model.

Today, there is an over-reliance on legacy and poorly integrated IT systems, leaving troves of hypersensitive constituent data vulnerable, and government agencies have become increasingly appealing targets for cybercriminals. Many agencies still rely on outdated, five-decade-old technology infrastructure and deal with a multitude of systems that need to interact with each other, which makes it even more challenging to lock down these systems. Critical infrastructure industries have more budget constraints than ever; they need flexible and affordable solutions to maintain business continuity and protect against system loss.

Protecting your organization’s data assets

The private sector, which owns and operates most US critical infrastructure, will continue being instrumental in helping government organizations (of all sizes) modernize their cyber defenses. The US continues to make strides in creating specific efforts that encourage cyber resilience and counter these emerging threats.

Agencies and US data centers must first focus on solutions that comply with data protection frameworks like HIPAA, CJIS, and NIST 800-171, and then develop several key pillars for data protection built around the Zero Trust concept. These include safety (ensuring organizational data, applications, and systems are always available), accessibility (allowing employees to access critical data anytime and anywhere), and privacy and authenticity (controlling who has access to your organization’s digital assets).

New cloud-based data backup, protection and cybersecurity solutions that are certified and compliant with the appropriate frameworks will enable agencies to maximize operational uptime, reduce the threat of ransomware, and ensure the highest levels of data security possible across all public sector computing environments.

Conclusion

First and foremost, the public sector and US data centers must prioritize using compliant and certified services to ensure that specific criteria are met.

 

What Digital Transformation Truly Means, with Srini Alagarsamy

Searching for Digital Transformation on Google fetches over five hundred million results, a good number of which aim to define the term. Following are my thoughts from the vantage point of having strategized and executed digital transformations in leading organizations. Throughout history, firms have had to transform many times, from the invention of money to the advent of electricity, and through the industrial, railroad, communication, and internet revolutions. Generally, transformations have occurred roughly every 50 years, and thus transformation is an iterative process: a journey that firms must embrace, understanding that it is about continually getting better, not reaching an end state. While a few firms help shape new consumer preference categories, the vast majority must adjust, each time reimagining their business model around those newly formed preferences, designing products, interactions, and business processes that all revolve around the customer.

Digital transformation is the latest iteration of business transformation, with firms adapting to new consumer preferences focused on digital channels. These preferences in the past two decades have largely tended to be at-the-glass (mobile, web) experiences. While digital twins have existed for some time, there is still a marked separation between the physical and virtual worlds. With the advent of Web 3.0 and Spatial Web, we are entering a different era of experience where these boundaries will continue to get blurred. Consumer preferences will shift from at-the-glass to inside-the-jar experiences.

In many firms, digital transformation conversations begin with discussions around Cloud, Agile, DevOps, AI, ML, Data Science, etc. While these are key building blocks, they are only a means to an end. They are the How, not the Why or the What. This is akin to picking up the hammer before knowing where the nails are, and in my humble opinion it needs to change. Every company that aims to drive digital transformation ought to ask Why they need to change and What the customer will gain as part of that change. If they go directly to the How, they could call their efforts digitization or digitalization, but not digital transformation.

To truly embrace digital transformation, deliberate analysis of consumer needs, planning, and execution is important.

 

Start with Why:

Every firm must ask these questions before embarking on a digital transformation:

  •     Will this create a fundamental shift in customer experience for the good?
  •     Will this create net new opportunities for the firm or its customers?
  •     Will this create significant operational efficiencies for the firm?
  •     Will this create marketplace differentiation?

 

Define the What:

  •     Based on the Why, create the manifestations that reach customers best. These could be products, platforms, services, experiences.

 

Apply the How:

What must the firm do to bring the What to life effectively and efficiently? Some objective questions to ask here:

  •     Which enablers will get us there? Agile, Cloud, DevOps?
  •     Is a cultural transformation needed before you digitally transform?
  •     Should we build, buy, partner, or a combination of the three?

 

In summary, transformation is a continuous journey, and successful firms will constantly transform themselves in ways that fundamentally alter the value they offer their customers. If done well, transformation should not feel like a project or a program; it’s business as usual. My suggestion is to start with the Why.

 

Srini Alagarsamy | Vice President, Digital Software Solutions at GM Financial

I am a technologist both by passion and profession. My first interaction with computers was in my late teens but I soon fell in love with developing software and experimenting with hardware. Professionally, I have been fortunate to be part of world-class organizations, driving major business and digital transformation initiatives. Through this blog, I intend to share perspectives I have gained and lessons I have learned about business and leadership.

Finding the right MSSP for securing your business and training employees

Over the past year, small businesses have had to navigate the pandemic’s many challenges — from changes in business models and supply shortages to hiring and retaining employees. On top of these pandemic-driven challenges, SMBs also faced a growing business risk: cybersecurity incidents.

Cybercriminals often target SMBs due to the limited security resources and training that leave these businesses vulnerable. According to a Verizon study, 61% of all SMBs reported at least one cyberattack during 2020, with 93% of small business attacks focused on monetary gain. Unfortunately, the high costs incurred during a cyberattack force many SMBs to close after an incident.

Cybersecurity is no longer just “nice to have” for SMBs, but many business owners don’t know where to start. And while measures like a VPN or antivirus system can help, they aren’t enough by themselves. Managed security service providers (MSSPs) are a valuable resource for SMBs, allowing them to bring in the expertise needed to secure their infrastructure, expertise they might not be able to afford in-house in this highly competitive labor market.

When looking for an MSSP, hundreds of options often leave businesses overwhelmed. To learn more about the value MSSPs should and can bring to the table, I spoke with Frank Rauch and Shay Solomon at Check Point Software Technologies.

Koziol: What should small and medium business owners look for when selecting a cybersecurity MSSP? What are the must-haves and the nice-to-haves?

Rauch: We are living in a time where businesses, SMBs especially, cannot afford to leave their security to chance. SMBs are a prime target for cybercriminals, as SMBs inherently struggle with the expertise, resources and IT budget needed to protect against today’s sophisticated cyberattacks. We are now experiencing the fifth generation of cyberattacks: large-scale, multi-vector, mega attacks targeting businesses, individuals and countries. SMBs should be looking for a true leader in cybersecurity. They should partner with an MSSP that can cover all customer sizes and all use cases. To make it easy, we can focus on three key areas:

  1. Security. The best MSSPs have security solutions that are validated by renowned third parties. They should prove their threat prevention capabilities and leverage a vast threat intelligence database that can help prevent threats at a moment’s notice.
  2. Capabilities. MSSPs should offer a broad set of solutions covering every customer size and use case—from large enterprises to small businesses, and from data centers, mobile, cloud and SD-WAN protection all the way to IoT security. Having this broad range of expertise will ensure that your MSSP is ready to cover your business in all instances.
  3. Individualization. This may be one of the most critical areas. Your MSSP should offer flexible growth-based financial models and provide 24/7 service and support with real-time prevention. Collaborative business processes and principles will ensure success and security in the long run.

Koziol: How can SMBs measure the value of bringing in an MSSP? Or, the risks of inaction?

Rauch: The biggest tell-tale sign of a match made in heaven is if you’re receiving your security needs through one single vendor. If not, those options are out there! Getting the best security through one experienced, leading vendor can reduce costs, simplify support and ensure consistency across all products. This ranges from simply protecting your sensitive data all the way to ensuring you can secure the business through a centralized security management platform. How can you protect what you can’t see?

It makes sense to keep an eye on how many cybersecurity attacks you’re preventing each month. How long is it taking you to create, change and manage your policies? Are you scaling to your liking? Can you adapt on the fly if need be? Are your connected devices secure? These are just some examples that you should be able to measure with simplicity.

Koziol: How has the shift in remote/hybrid workforce changed how cybersecurity MSSPs support SMBs?

Rauch: The shift to a larger work-from-home practice has caused attackers to move their attacks outside the corporate network. It is more important now than ever for MSSPs to be providing their SMBs with a complete portfolio — endpoint, mobile, cloud, email and office — that allows them to connect reliably, scale rapidly and stay protected, no matter the environment.

The best MSSPs should have been ready for this day. At any moment, day or night, your organization can be victimized by devastating cybercrime. You can’t predict when cyberattacks will happen, but you can use proactive practices and security services to quickly mitigate their effects or prevent them altogether. The shift to a hybrid workforce exposed the holes in the existing security infrastructure.

On the bright side, security incidents present an opportunity to comprehensively reevaluate and improve information security programs. They show threat vectors that we previously overlooked and raise awareness across the organization to enhance existing or implement new controls. So at the very least, this shift has been an eye-opener for MSSPs.

Koziol: Should MSSPs offer security awareness and training as part of their offering? Why?

Solomon: Absolutely, yes. At the end of the day, knowledge is power. Cyberattacks are evolving, and training can help keep SMB employees protected and educated. According to a study from VIPRE, 47% of SMB leaders reported keeping data secure as their top concern. At the same time, many SMBs lack sufficient skills and capacity to drive improved security on their own.

The only way to fight cybercrime effectively is by sharing experiences and knowledge. Due to the cyber shortage, Check Point Software, along with 200 global training partners, recently announced a free cybersecurity training program called Check Point Mind. It offers many training and cybersecurity awareness programs to give SMBs (or any business) the chance to extend their skills with comprehensive cybersecurity training programs led by world-class professionals.

Koziol: How can working with an MSSP on security awareness education improve a business’s overall security posture?

Solomon: Raising awareness with employees is a crucial step that’s often overlooked. Employees need to be able to identify a phishing attempt and know how to react. In our experience, we see a majority of employees attacked using emails. They receive an email that looks like an official email from someone with authority, asking them to open attachments or click on a link that contains malicious intent.

If employees go through a training course that teaches them what to look for in an attack, this will surely reduce the chance of that employee falling victim to the phishing attempt.

Koziol: What questions should SMBs be asking their current or future MSSPs about cybersecurity?

Solomon: Building on what was mentioned earlier, it is never too late to reevaluate and improve information security programs. Asking questions and investing in a better security posture reveals threat vectors that might previously have been overlooked and raises awareness across the organization of the need to improve existing controls or implement new ones. SMBs must proactively approach their MSSPs to ensure they are getting the best bang for their buck—security solutions that require minimal configuration and simple onboarding. In addition, they need to ensure they are taking the proper steps when evaluating security architecture, advanced threat prevention, endpoint, mobile, cloud, email and office.

Koziol: What’s ahead for MSSPs in the cybersecurity space? What should SMB owners expect to see next?

Rauch: One of the key areas we’ll see continuously growing is the need for a next-generation cybersecurity solution that enables organizations to proactively protect themselves against cyberthreats: incident detection and response management. As attacks continue to evolve and grow in numbers, unified visibility is a must-have across multiple vectors that a cyberthreat actor could use to attack a network.

A common challenge we see is an overwhelming volume of security data generated by an array of stand-alone point security solutions. What’s needed is a single dashboard, or, in other words, unified visibility, that enables a lean security team to maximize their efficiency and effectiveness. SMBs should take the opportunity to check security investments. The highest level of visibility, reached through consolidation, will guarantee the best effectiveness. […]

 

Talent Shortage: Are Universities Delivering Well-Prepared IT Graduates?

The tech talent crunch is impacting organizations of all sizes. The lack of qualified IT specialists is a rising concern with no end in sight.

Taking into account accelerated retirement plans of the Baby Boomers and the “great resignation” spurred by the pandemic, tech companies will be even more reliant on the upcoming generation of university graduates to fill the ranks of data specialists, AI experts, and software engineering pros.

Josh Drew, Boston regional director at Robert Half, a staffing and talent solutions company, says he regularly sees first-year computer science graduates take job opportunities in the development space with salaries ranging from $90,000 to north of $100,000.

“If you look at the opportunities of coming directly out of school and the skillset they leave with, I think there is a clear indication the university formula is working,” he says.

He added that the pandemic, which forced almost all university students into a totally virtual learning space, has also prepared them for more flexible, part-time remote work.

“They’ve been doing online classes instead of turning their assignments in to teachers — they’re uploading it and hosting it on sites, sharing through Google or Slack,” Drew says. “The model fits well with the hybrid workplace in the sense that it’s not always on-site turning in hard work — it’s working in a virtual world or outside of the classroom.”

Changing Tech Landscape

However, there is some concern that universities are not teaching the most in-demand skills and are not equipping graduates with soft skills like communication.

Catherine Southard, vice president of engineering at D2iQ, says her company hasn’t had much success finding new grads with experience in Kubernetes and the Go programming language, in which D2iQ’s product is primarily developed.

“Part of that is because the tech landscape changes so quickly. It would be great for a representative from tech companies — maybe a panel of CTOs — to sit down with curriculum developers every couple of years and talk through industry trends and where technology is headed, and then brainstorm how to bridge the gap between university and industry,” she says.

Southard adds that students can research jobs that look interesting, then see what tech stack those companies are using. They can then equip themselves to land those jobs by studying that technology through free online resources or courses.

Importance of Internship Programs

She sees another area of improvement in support for internship programs. Historically, D2iQ had a program in the US, but it was expensive to operate, and it didn’t lead to long-term employee retention, except for a couple of stand-out talents.

She noted larger tech companies can sponsor internship programs, but for startups, Southard would like to see universities splitting some of the operating costs as an investment in their students.

“We have had success hiring student workers in our German office, and that is an excellent setup for all involved,” she says. “We get smart, motivated students, the students get real-world experience, and our engineers can focus on more challenging problems as the students are able to perform more basic tasks.”

She explained that many people looking to change careers participate in code camps lasting a couple of months. These camps give them the skills needed to hit the ground running as developers; universities might do well to look at what those programs are doing and create a similar curriculum.

“A four-year degree is great, and there are lots of benefits to it, but it’s really not required anymore to be a developer,” Southard points out. “Universities should make sure their graduating students are as immediately employable as code camp graduates.”

Code Camps

Drew pointed out that in the Boston area, he’s seen the growth of these code camps, as well as different academies and even school contests geared around topics like ethical hacking.

“More than ever, the curriculum within the IT space is definitely like real-world applications,” he says.

He’s seen the development of e-commerce and entrepreneurship programs where students build and develop products to sell on websites.

“They’re bridging the gap with soft skills and teamwork, collaborating with others and often in a virtual environment,” Drew says.

Southard also notes that most physical science students will only have a few required English courses, and no communication courses, but success in the workplace will come down to their attitude and their abilities to collaboratively solve problems and communicate clearly.

“If universities could have a peer review process on programming assignments using standard industry tooling such as Github, that would help build some of these skills and better prepare students for their career,” Southard says.

From the perspective of Kevin Chandra, the Gen-Z co-founder and CEO of Typedream, his university experience at the University of Southern California adequately prepared him for a career in the real world.

“Our universities teach us the fundamentals of computer science; the reason that they do this is because technologies change very quickly,” he says. “If universities were to teach industry-standard technologies, by the time you graduate it will all have changed.”

Exposure to Tech Community

Chandra says what he thinks is currently lacking in universities is providing IT students with more exposure to the tech community.

“I have learned from Twitter, Substack blogs, and podcasts about relevant trends, technologies, and marketing strategies much more than I could ever have from outdated books,” he says. “I wish universities invited thought leaders to lecture students at universities.”

Mohit Tiwari, co-founder and CEO at Symmetry Systems, agreed with Chandra that universities excel at teaching young IT professionals the type of fundamental technical skill sets that can set students up for decades.

“For example, students trained in programming languages, distributed systems, and data engineering can now work on critical infrastructure problems like privacy and cloud security,” he says.

More broadly, universities are also a critical staging area before the students are launched into production.

“The goal is providing those students with a safe place to make mistakes and learn in a cohort with mentors to help, and not to load them with every skill they will need for 30 years,” Tiwari added.

Tiwari says tech companies could do a better job of reaching out to IT students and forming relationships with higher education institutions by creating open-source testbeds that reflect real-world deployments, which students can use as projects.

Chandra says big tech companies, especially in the US, do a good job of reaching out to university grads by providing internship opportunities starting from freshman year. […]

 

Is a Merger Between Information Security and Data Governance Imminent?

As with any merger, it is always difficult to predict an outcome until the final deal papers are signed and press releases hit the wires. However, there are clear indications that a tie-up between these two disciplines is essential, and we will all be the better for it. Data Governance has historically focused on the use of data in a business, legal and compliance context rather than how it should be protected, while the opposite is true for Information Security.

The idea of interweaving Data Governance and Information Security is not entirely new. Gartner discussed this in their Data Security Governance model; EDRM integrated multiple stakeholders, including Information Security, Privacy, Legal and Risk, into an overarching Unified Data Governance model; and an integrated approach to Governance, Risk, and Compliance has long been an aspiration in the eGRC market. Organizations with more mature programs are likely to have some level of integration between these functions already, but many continue to struggle with the idea and often treat them as separate, siloed programs.

As programs go, Information Security is ahead of Data Governance in the level of attention it receives in the boardroom, brought about primarily by newsworthy events that demonstrated what security and privacy practitioners had been warning about for a long time. These critical risks to the public and private sectors inspired significant, sweeping frameworks and industry standards (PCI, NIST, ISO, ISACA, SOC 2) and regulatory legislation (HIPAA, GDPR, NYDFS), and gave Chief Information Security Officers (CISOs) a platform for change.

By contrast, data governance has been more fragmented in its definition, organization, development, and funding. Many organizations accept the value of data governance, particularly as a proactive means to minimize risk while enabling the expansive use of information required in today’s business environment. However, enterprises still struggle to balance information risk and value, and to establish the right enablers and controls.

Drivers

Risks and affirmative obligations associated with information are the primary drivers for the intersection of data governance and information security. Information security is so critical because the loss of certain types of data (through exfiltration, or loss of access due to ransomware) carries legal and compliance consequences, along with impacting normal business operations. And a lack of effective legal and compliance controls often leads to increased information security and privacy risk.

Additional common drivers include:

  • Volume, velocity, mobility, and sensitivity of information
  • Volume and complexity of legal, compliance, and privacy requirements
  • Hybrid technology and business environments
  • Multinational governance models and operations
  • Headline and business interruption risks

Finally, an underlying driver is the need to leverage investments in technology, practices, and personnel across an organization. The interrelationships of so many information requirements simply demand a more coordinated approach.

Merging the models

We chose Information Risk Management to define a construct that encompasses the overarching disciplines and requirements. First, we did so because it places the focus on information. For example, the same piece of information that requires protection may also have retention and discovery requirements. Second, risk management recognizes the need to balance the value and use of information from a business perspective, while also providing appropriate governance or protection. Risk management also serves as an important means to evaluate priorities in investment, resources, and audit functions.

Figure 1: Information Risk Management

The primary objective is to integrate processes, people, and solutions into a framework that addresses common requirements, and does so “in depth” for both security and governance. Security people, practices and technologies have long been deployed at many levels (in depth) to protect the organization. The same has not often been the case for governance (legal, compliance, and privacy) obligations. New practices and technologies are enablers for intersecting programs, and support alignment among key constituencies, including Information Security, IT, Legal, Privacy, Risk and Compliance. Done right, this provides leverage in an organization’s human and technology investments, improves risk posture, and increases the rate and reach of new practices and solutions.

Meshing the disciplines and elements of each program is not meant as a new organizational construct; rather, it should start with a firm understanding of information requirements from key stakeholders, and from there establish synergies. The list below, which is not meant to be exhaustive, provides examples of shared enabling practices and technologies:

Figure 2: Shared Enablers and Requirements

Conclusion

Integrating data governance, information security and privacy frameworks allows an enterprise to gain leverage from areas of common investment and provides a more comprehensive enterprise risk management strategy. By improving proactive information management, organizations increase preventative control effectiveness and decrease reliance on detection and response activities. It also develops cross-functional capabilities across Privacy, Legal, Compliance, IT, and Information Security. […]

 

 

Top 15 cybersecurity predictions for 2022

Over the past several years, cybersecurity risk management has become top of mind for boards. And rightly so. Given the onslaught of ransomware attacks and data breaches that organizations experienced in recent years, board members have increasingly realized how vulnerable they are.

This year, in particular, the public was directly impacted by ransomware attacks, from gasoline shortages, to meat supply, and even worse, hospitals and patients that rely on life-saving systems. The attacks reflected the continued expansion of cyber-physical systems — all of which present new challenges for organizations and opportunities for threat actors to exploit.

There should be a shared sense of urgency about staying on top of the battle against cyberattacks. Security columnist John McClurg, Vice President and Ambassador-At-Large in Cylance’s Office of Security & Trust, explained it best in his latest Cyber Tactics column: “It’s up to everyone in the cybersecurity community to ensure smart, strong defenses are in place in the coming year to protect against those threats.”

As you build your strategic planning, priorities and roadmap for the year ahead, security and risk experts offer the following cybersecurity predictions for 2022.

Prediction #1: Increased Scrutiny on Software Supply Chain Security, by John Hellickson, Cyber Executive Advisor, Coalfire

“As part of the executive order to improve the nation’s cybersecurity previously mentioned, one area of focus is the need to enhance software supply chain security. There are many aspects included that most would consider industry best practice of a robust DevSecOps program, but one area that will see increased scrutiny is providing the purchaser, the government in this example, a software bill of materials. This would be a complete list of all software components leveraged within the software solution, along with where it comes from. The expectation is that everything that is used within or can affect your software, such as open source, is understood, versions tracked, scrutinized for security issues and risks, assessed for vulnerabilities, and monitored, just as you do with any in-house developed code. This will impact organizations that both consume and those that deliver software services. Considering this can be very manual and time-consuming, we could expect that Third-Party Risk Management teams will likely play a key role in developing programs to track and assess software supply chain security, especially considering they are usually the front line team who also receives inbound security questionnaires from their business partners.”
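To make the idea of a software bill of materials concrete, here is a minimal sketch in Python. The application and component names are hypothetical, and the structure only loosely follows the CycloneDX JSON format; in practice an SBOM would be generated by build tooling rather than written by hand.

```python
import json

def make_sbom(app_name: str, components: list) -> dict:
    """Build a minimal CycloneDX-style software bill of materials.

    Each entry records what the software contains and where it came from,
    so a purchaser can track versions and check components against known
    vulnerabilities.
    """
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.4",
        "metadata": {"component": {"type": "application", "name": app_name}},
        "components": [
            {
                "type": "library",
                "name": c["name"],
                "version": c["version"],
                # Package URL identifies the component's origin ecosystem.
                "purl": f"pkg:pypi/{c['name']}@{c['version']}",
            }
            for c in components
        ],
    }

# Hypothetical dependency list, for illustration only.
sbom = make_sbom("billing-service", [
    {"name": "requests", "version": "2.28.1"},
    {"name": "cryptography", "version": "38.0.1"},
])
print(json.dumps(sbom, indent=2))
```

The point of the exercise is the inventory itself: once every component and version is enumerated in one machine-readable document, tracking and vulnerability assessment can be automated.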

 

Prediction #2: Security at the Edge Will Become Central, by Wendy Frank, Cyber 5G Leader, Deloitte

 

“As the Internet of Things (IoT) devices proliferate, it’s key to build security into the design of new connected devices themselves, as well as the artificial intelligence (AI) and machine learning (ML) running on them (e.g., tinyML). Taking a cyber-aware approach will also be crucial as some organizations begin using 5G bandwidth, which will drive up both the number of IoT devices in the world and attack surface sizes for IoT device users and producers, as well as the myriad networks to which they connect and supply chains through which they move.”

 

Prediction #3: Boards of Directors will Drive the Need to Elevate the Chief Information Security Officer (CISO) Role, by Hellickson

 

“In 2021, there was much more media and senior executive awareness of the impacts of large cyberattacks and ransomware that brought many organizations to their knees. These high-profile attacks have elevated cybersecurity conversations in the boardroom across many different industries. This has reinforced the need for CISOs to be constantly on top of current threats while maintaining an agile but robust security strategy that also enables the business to achieve revenue and growth targets. Recent surveys show CISO reporting structures moving up the chain, out from underneath the CIO or the infrastructure team, where they have sat for many years, to reporting directly to the CEO. The ability to speak fluently about threat and risk management as it applies to the business is table stakes for any executive with cybersecurity and board reporting responsibilities. This elevated role will require a cybersecurity program strategy that extends beyond the standard industry frameworks and IT speak, and instead demonstrates how the cybersecurity program is threat-aware while remaining aligned to the executive team’s business objectives, producing positive business and cybersecurity outcomes. More CISOs will look for executive coaches and trusted business partners to help them overcome any weaknesses in this area.”

 

Prediction #4: Increase of Nation-State Attacks and Threats, by John Bambenek, Principal Threat Researcher at Netenrich

 

“Recent years have seen cyberattacks large and small conducted by state and non-state actors alike. State actors organize and fund these operations to achieve geopolitical objectives and seek to avoid attribution wherever possible. Non-state actors, however, often seek notoriety in addition to the typical monetary rewards. Both actors are part of a larger, more nebulous ecosystem of brokers that provides information, access, and financial channels for those willing to pay. Rising geopolitical tensions, increased access to cryptocurrencies and dark money, and general instability due to the pandemic will contribute to a continued rise in cyber threats in 2022 for nearly every industry. Top-down efforts, such as sanctions by the U.S. Treasury Department, may lead to arrests but will ultimately push these groups further underground and out of reach.”

 

And, Adversaries Outside of Russia Will Cause Problems

 

Recognizing that Russia is a safe harbor for ransomware attackers, Dmitri Alperovitch, Chairman, Silverado Policy Accelerator: “Adversaries in other countries, particularly North Korea, are watching this very closely. We are going to see an explosion of ransomware coming from DPRK and possibly Iran over the next 12 months.”

 

Ed Skoudis, President, SANS Technology Institute: “What’s concerning about this potential reality is that these other countries will have less practice at it, making it more likely that they will accidentally make mistakes. A little less experience, a little less finesse. I do think we are probably going to see — maybe accidentally or maybe on purpose — a significant ransomware attack that might bring down a federal government agency and its ability to execute its mission.”

 

Prediction #5: The Adoption of 5G Will Drive The Use Of Edge Computing Even Further, by Theresa Lanowitz, Head of Evangelism at AT&T Cybersecurity

 

“While in previous years, information security was the focus and CISOs were the norm, we’re moving to a new cybersecurity world. In this era, the role of the CISO expands to a CSO (Chief Security Officer) with the advent of 5G networks and edge computing.

The edge is in many locations — a smart city, a farm, a car, a home, an operating room, a wearable, or a medical device implanted in the body. We are seeing a new generation of computing with new networks, new architectures, new use cases, new applications/applets, and of course, new security requirements and risks.

While 5G adoption accelerated in 2021, in 2022 we will see 5G go from new technology to business enabler. While 5G will impact new ecosystems, devices, applications, and use cases ranging from automatic mobile device charging to streaming, it will also benefit from the adoption of edge computing due to the convenience it brings. We’re moving away from the traditional information security approach to securing edge computing. With this shift to the edge, we will see more data from more devices, which will lead to the need for stronger data security.”

 

Prediction #6: Continued Rise in Ransomware, by Lanowitz

 

“The year 2021 was the year the adversary refined their business model. With the shift to hybrid work, we have witnessed an increase in security vulnerabilities leading to unique attacks on networks and applications. In 2022, ransomware will continue to be a significant threat. Ransomware attacks are more understood and more real as a result of the attacks executed in 2021. Ransomware gangs have refined their business models through the use of Ransomware as a Service and are more aggressive in negotiations by doubling down with distributed denial-of-service (DDoS) attacks. The further convergence of IT and Operational Technology (OT) may cause more security issues and lead to a rise in ransomware attacks if proper cybersecurity hygiene isn’t followed.

“While many employees are bringing their cyber skills and learnings from the workplace into their home environment, in 2022, we will see more cyber hygiene education. This awareness and education will help instill good habits and generate further awareness of what people should and shouldn’t click on, download, or explore.”

 

Prediction #7: How the Cyber Workforce Will Continue to be Revolutionized Amid an Ongoing Shortage of Employees, by Jon Check, Senior Director of Cyber Protection Solutions at Raytheon Intelligence & Space

 

“Moving into 2022, the cybersecurity industry will continue to be impacted by an extreme shortage of employees. With that said, there will be unique advantages when facing the current so-called ‘Great Resignation’ that is affecting the entire workforce as a whole. As the industry continues to advocate for hiring individuals outside of the cyber industry, there is a growing number of individuals looking to leave their current jobs for new challenges and opportunities to expand their skills and potentially have the choice to work from anywhere. While these individuals will still need to be trained, there is extreme value in considering those who may not have the most perfect resume for the cyber jobs we’re hiring for, but may have a unique point of view on solving the next cyber challenge. This expansion will, of course, increase the importance of a positive work culture as such candidates will have a lot of choices of the direction they take within the cyber workforce — a workforce that is already competing against the same pool of talent. With that said, we will never be able to hire all the cyber people we need, so in 2022, there will be a heavier reliance on automation to help fulfill those positions that continue to remain vacant.”

 

Prediction #8: Expect Heightened Security around the 2022 Election Cycle, by Jadee Hanson, CIO and CISO of Code42

 

“With multiple contentious and high-profile midterm elections coming up in 2022, cybersecurity will be a top priority for local and state governments. While security protections were in place to protect the 2020 election, publicized conversations surrounding the uncertainty of its security will facilitate heightened awareness around every aspect of voting next year.”

 

Prediction #9: A Shift to Zero Trust, by Brent Johnson, CISO at Bluefin

 

“As the office workspace model continues to shift to a more hybrid and full-time remote architecture, the traditional network design and implicit trust granted to users or devices based on network or system location are becoming a thing of the past. While the security industry had already begun its shift to the more secure zero-trust model (where anything and everything must be verified before connecting to systems and resources), the increased use of mobile devices, bring your own device (BYOD), and cloud service providers has accelerated this move. Enterprises can no longer rely on a specific device or location to grant access.

Encryption technology is obviously used as part of verifying identity within the zero-trust model, and another important aspect is to devalue sensitive information across an enterprise through tokenization or encryption. When sensitive data is devalued, it becomes essentially meaningless across all networks and devices. This is very helpful in limiting security practitioners’ area of concern and allows for designing specific micro-segmented areas where only verified and authorized users/resources may access the detokenized, or decrypted, values. As opposed to trying to track implicit trust relationships across networks, micro-segmented areas are much easier to lock down and enforce granular identity verification controls in line with the zero-trust model.”
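The vault-style tokenization Johnson describes can be sketched in a few lines: sensitive values are swapped for random tokens before they circulate, and the real values can be recovered only inside a tightly controlled zone. This is a minimal illustration with hypothetical names (`TokenVault`, `tokenize`, `detokenize`), not a description of any particular vendor's product.

```python
import secrets

class TokenVault:
    """Minimal vault-style tokenizer: sensitive values are replaced by
    random tokens, and the real values live only inside the vault."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so equal values stay joinable downstream.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only callers inside the micro-segmented zone should reach this.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
assert t != "4111-1111-1111-1111"             # devalued outside the vault
assert vault.detokenize(t) == "4111-1111-1111-1111"
```

Because the token carries no derivable relationship to the original value, a stolen copy of the tokenized dataset is, as the prediction puts it, essentially meaningless; only the vault's small, verifiable surface needs hardening.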

 

 

Prediction #9: Securing Data with Third-Party Vendors in Mind Will Be Critical, by Bindu Sundareason, Director at AT&T Cybersecurity

 

“Attacks via third parties are increasing every year as reliance on third-party vendors continues to grow. Organizations must prioritize the assessment of top-tier vendors, evaluating their network access, security procedures, and interactions with the business. Unfortunately, many operational obstacles will make this assessment difficult, including a lack of resources, increased organizational costs, and insufficient processes. The lack of up-to-date risk visibility on current third-party ecosystems will lead to loss of productivity, monetary damages, and damage to brand reputation.”

 

Prediction #10: Increased Privacy Laws and Regulation, by Kevin Dunne, President of Pathlock

 

“In 2022, we will continue to see jurisdictions pass further privacy laws to catch up with states like California, Colorado and Virginia, which have recently passed bills of their own. As companies look to navigate the sea of privacy regulations, there will be an increasing need to be able to provide a real-time, comprehensive view of what data is being processed and stored, who can access it, and most importantly, who has accessed it and when. As the number of distinct regulations continues to grow, the pressure on organizations to put in place automated, proactive data governance will increase.”
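The "who has accessed it and when" requirement Dunne highlights boils down to an append-only access log that can be queried per record. The sketch below, with hypothetical names (`AccessEvent`, `AuditLog`, `who_accessed`), shows the minimal data shape such a governance layer needs; real systems would add tamper-evidence and retention policies.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessEvent:
    user: str
    record_id: str
    action: str                      # e.g. "read", "update"
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class AuditLog:
    """Append-only log answering: who accessed which data, and when."""

    def __init__(self):
        self._events = []

    def record(self, user, record_id, action):
        self._events.append(AccessEvent(user, record_id, action))

    def who_accessed(self, record_id):
        # The per-record view regulators increasingly expect on demand.
        return [(e.user, e.at, e.action) for e in self._events
                if e.record_id == record_id]

log = AuditLog()
log.record("alice", "customer:42", "read")
log.record("bob", "customer:99", "update")
assert [u for u, _, _ in log.who_accessed("customer:42")] == ["alice"]
```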

 

Prediction #11: Cryptocurrency to Get Regulated, by Joseph Carson, Chief Security Scientist and Advisory CISO at ThycoticCentrify

 

“Cryptocurrencies are surely here to stay and will continue to disrupt the financial industry, but they must evolve to become a stable method for transactions and accelerate adoption. Some countries have taken the stance that energy consumption is creating a negative impact and are therefore facing decisions to either ban or regulate cryptocurrency mining. Meanwhile, several countries have seen cryptocurrencies as a way to differentiate their economies, become more competitive in the tech industry and attract investment. In 2022, more countries will look at how they can embrace cryptocurrencies while also creating more stabilization, and increased regulation is only a matter of time. Stabilization will accelerate adoption, but the big question is how the value of cryptocurrencies will be measured. How many decimals will be the limit?”

 

Prediction #12: Application Security in Focus, by Michael Isbitski, Technical Evangelist at Salt Security

 

“According to the Salt Labs State of API Security Report, Q3 2021, there was a 348% increase in application programming interface (API) attacks in the first half of 2021 alone, and that number is only set to go up.

With so much at stake, 2022 will witness a major push from nonsecurity and security teams towards the integration of security services and automation in the form of machine assistance to mitigate issues that arise from the rising threat landscape. The industry is beginning to understand that by treating API security as a strategic discipline rather than a subcomponent of other security domains, organizations can more effectively align their technology, people, and security processes to harden their APIs against attacks. Organizations need to identify and determine their current level of API maturity and integrate processes for development, security, and operations accordingly; complete, comprehensive API security requires a strategic approach in which all three work in synergy.

To mitigate potential threats and system vulnerabilities, further industry-wide recognition of a comprehensive approach to API security is key. Next year, we anticipate that more organizations will see the need for and adopt solutions that offer a full life cycle approach to identifying and protecting APIs and the data they expose. This will require a significant change in mindset, moving away from the outdated practices of proxy-based web application firewalls (WAFs) or API gateways for runtime protection, as well as scanning code with tools that do not provide satisfactory coverage and leave business logic unaddressed. As we’ve already begun to witness, security teams will now focus on accounting for unique business logic in application source code as well as misconfigurations or misimplementations within their infrastructure that could lead to API vulnerabilities.

Implementing intelligent capabilities for behavior analysis and anomaly detection is also another way organizations can improve their API security posture in 2022. Anomaly detection is essential for satisfying increasingly strong API security requirements and defending against well-known, emerging and unknown threats. Implementing solutions that effectively utilize AI and ML can help organizations ensure visibility and monitoring capabilities into all the data and systems that APIs and API consumers touch. Such capabilities also help mitigate any manual mistakes that inadvertently create security gaps and could impact business uptime.”
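The behavior-analysis capability Isbitski describes often starts far simpler than full ML: establish a per-consumer traffic baseline and flag requests that deviate sharply from it. As a minimal sketch (the z-score threshold and the example counts are illustrative assumptions, not from the report):

```python
from statistics import mean, stdev

def is_anomalous(history, current, threshold=3.0):
    """Flag a request count as anomalous when it sits more than
    `threshold` standard deviations above the historical mean."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # Perfectly flat baseline: any deviation is suspicious.
        return current != mu
    return (current - mu) / sigma > threshold

# Typical per-minute API call counts for one API consumer.
baseline = [98, 102, 101, 99, 100, 103, 97, 100]

print(is_anomalous(baseline, 104))   # ordinary fluctuation -> False
print(is_anomalous(baseline, 900))   # sudden scraping spike -> True
```

Production systems layer richer features on top (endpoint sequences, payload shapes, per-identity baselines), but the principle is the same: model normal API consumption, then alert on what the model cannot explain.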

 

Prediction #13: Disinformation on Social Media, by Jonathan Reiber, Senior Director of Cybersecurity Strategy and Policy at AttackIQ

 

“Over the last two years, pressure to regulate Section 230 rose in Congress and the executive branch, and it increased following the disclosures made by Frances Haugen, a former Facebook data scientist, who came forward with evidence of widespread deception related to Facebook’s management of hate speech and misinformation on its platform. Concurrent with those disclosures, in mid-November, the Aspen Institute’s Commission on Information Disorder published the findings of a major report, painting a picture of the United States as a country in a crisis of trust and truth, and highlighting the outsize role of social media companies in shaping public discourse. Building on Haugen’s testimony, the Aspen Institute report, and findings from the House of Representatives Select Committee investigating the January 6, 2021 attack on the U.S. Capitol, we should anticipate increasing regulatory pressure from Congress. Social media companies will likely continue to spend large sums of money on lobbying efforts to shape the legislative agenda to their advantage.”

 

Prediction #14: Ransomware To Impact Cyber Insurance, by Jason Rebholz, CISO at Corvus Insurance

 

“Ransomware is the defining force in cyber risk in 2021 and will likely continue to be in 2022. While ransomware has gained traction over the years, it jumped to the forefront of the news this year with high-profile attacks that impacted the day-to-day lives of millions of people. The increased visibility brought a positive shift in the security posture of businesses looking to avoid being the next news headline. We’re starting to see the proactive efforts of shoring up IT resilience and security defenses pay off, and my hope is that this positive trend will continue. Comparing Q3 2020 to Q3 2021, the share of ransom demands that were actually paid declined steadily, shrinking from 44% to 12%, due to improved backup processes and greater preparedness. Decreasing the need to pay a ransom to restore data is the first step in disrupting the cash machine that is ransomware. Although we cannot say for certain, in 2022, we can likely expect to see threat actors pivot their ransomware strategies. Attackers are nimble — and although they’ve had a ‘playbook’ over the past couple years, thanks to widespread crackdowns on their current strategies, we expect things to shift. We have already seen the opening moves from threat actors. In a shift from a single group managing the full attack life cycle, specialized groups have formed to gain access into companies and then sell that access to ransomware operators. As threat actors specialize in access into environments, it opens the opportunity for other extortion-based attacks such as data theft or account lockouts, none of which require data encryption. The potential for these shifts will create a need for heavier investment in tracking emerging tactics and trends to remove that volatility.”