Top 9 effective vulnerability management tips and tricks

The world is currently in a frenetic flux. With rising geopolitical tensions, an ever-present rise in cybercrime and continuous technological evolution, it can be difficult for security teams to maintain a straight bearing on what’s key to keeping their organization secure.

With the emergence of “Log4Shell,” the critical vulnerability in the widely used Log4j library, sound vulnerability management practices have jumped to the top of the list of skills needed to maintain a strong cybersecurity posture. The full impact of Log4j is expected to be realized throughout 2022.

As of 2021, missing security updates are a top-three security concern for organizations of all sizes — approximately one in five network-level vulnerabilities are associated with unpatched software.

Not only are attacks on the rise, but so are their financial impacts. According to Cybersecurity Ventures, global cybercrime costs are expected to grow 15% year over year, reaching $10.5 trillion annually by 2025.

Vulnerability management best practices

Whether you’re performing vulnerability management for the first time or revisiting your current practices in search of new perspectives or process efficiencies, several proven strategies can help reduce vulnerabilities.

Here are the top nine (we decided to just stop there!) tips and tricks for effective vulnerability management at your organization.

1. Vulnerability remediation is a long game

Extreme patience is required when it comes to vulnerability remediation. Your initial review of vulnerability counts, categories and recommended remediations may instill a false sense of confidence: you may expect a large reduction after only a few meetings and patch activities. Reality will unfold very differently.

Consider these factors as you begin initial vulnerability management efforts:

  • Take small steps: The initial goal should be incremental progress in reducing total vulnerabilities by severity, not an unrealistic expectation of total elimination. Ideally, as the months and quarters roll on, the technology estate should accumulate new vulnerabilities slightly more slowly than they are remediated.
  • Patience is a virtue: A patient mindset is unequivocally necessary to avoid mental defeat, burnout and complacency. Remediation progress will be slow, so sustain a methodical approach.
  • Learn from challenges: Roadblocks are opportunities to try alternate remediation strategies. Plan around what can be solved today or in the current week.

Avoid dwelling on every major problem preventing remediation; instead, adopt a growth mindset to overcome these challenges.

2. Cross-team collaboration is required

Achieving a large vulnerability reduction requires effective collaboration across technology teams. High vulnerability counts across the IT estate likely stem from several cultural and operational factors that pre-exist remediation efforts, including:

  • Insufficient staff to maintain effective vulnerability management processes
  • Legacy systems that cannot be patched because they run on very expensive hardware — or provide a specific function that is cost-prohibitive to replace
  • Ineffective patching solutions that do not or cannot apply necessary updates completely (e.g., the solution can patch web browsers but not Java or Adobe)
  • Misguided beliefs that specialized classes of equipment cannot be patched or rebooted; as a result, they are not revisited for extended periods

Part of your remediation efforts should focus on addressing systemic issues that have historically prevented effective vulnerability remediation while gaining support within or across the business to begin addressing existing vulnerabilities.

Determine how the various teams in your organization can serve as a force multiplier. For example, can the IT support desk or other technical teams assist directly in applying patches or decommissioning legacy devices? Can your vendors assist in applying patches or fine-tuning configurations of difficult-to-patch equipment?

These groups can assist in overall reduction while further plans are developed to address additional vulnerabilities.

3. Start by focusing on low-hanging fruit

Focus your initial efforts on the low-hanging fruit when building a plan to address vulnerabilities. Missing browser updates and outdated third-party software such as Java or Adobe products are likely to yield the largest initial reductions.

If software like Google Chrome or Firefox is missing the previous two years of security updates, it likely signifies the software is not being used. Some confirmation may be required, but the right response is likely to be removing the software, not applying patches.
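This triage can be sketched as a simple rule. The two-year threshold, inventory format and sample entries below are illustrative assumptions, not the output of any particular scanner:

```python
from datetime import date, timedelta

STALE_AFTER = timedelta(days=730)  # ~2 years without security updates

def triage_software(inventory, today):
    """Split an inventory of (name, last_security_update) records into
    removal candidates (likely unused) and patching candidates."""
    remove, patch = [], []
    for name, last_update in inventory:
        if today - last_update > STALE_AFTER:
            remove.append(name)  # likely unused: confirm, then uninstall
        else:
            patch.append(name)   # actively used: schedule updates
    return remove, patch

inventory = [
    ("Google Chrome", date(2020, 1, 15)),
    ("Mozilla Firefox", date(2022, 2, 1)),
]
remove, patch = triage_software(inventory, today=date(2022, 3, 1))
```

Whatever threshold you choose, the point is to route stale software into an uninstall queue rather than a patch queue.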

To prevent a recurrence, there will likely be a need to revisit workstation and server imaging processes to determine if legacy, unapproved or unnecessary software is being installed as new devices are provisioned.

4. Leverage your end-users when needed

Don’t forget to leverage your end-users as a possible remediation vector. A single email you spend 30 minutes carefully crafting, with instructions on how users can self-update difficult-to-patch third-party applications, can save many hours of effort compared to working with technical teams, where the end result may be a smaller reduction in vulnerabilities.

However, end-user involvement should be an infrequent and short-term approach as the underlying problems outlined in cross-team collaboration (tip #2) are addressed.

This also provides an indirect approach to increasing security awareness via end-user engagement. Users are more likely to prioritize security when they are directly involved in the process.

5. Be prepared to get your hands dirty

Many of the vulnerabilities that exist will require a manual fix, including but not limited to:

  • Unquoted service paths in program directories
  • Weak or missing passwords on peripheral devices like printers
  • Default SNMP community strings that need updating
  • Missing or misconfigured Windows registry settings
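The first item in that list lends itself to automated screening. Below is a minimal sketch of the unquoted-service-path check; on a real Windows host the paths would come from the service control manager, and the sample service entries are hypothetical:

```python
def is_unquoted_service_path(image_path):
    r"""Flag the classic Windows unquoted-service-path weakness: an unquoted
    binary path containing spaces, which Windows may misparse
    (C:\Program Files\App\svc.exe is attempted as C:\Program.exe first)."""
    path = image_path.strip()
    if path.startswith('"'):
        return False  # quoted paths are safe
    # Consider only the path up to and including the executable name
    exe_end = path.lower().find(".exe")
    binary = path[: exe_end + 4] if exe_end != -1 else path
    return " " in binary

# Hypothetical service image paths, as a scanner might report them
services = {
    "GoodSvc": r'"C:\Program Files\Good App\svc.exe" -run',
    "BadSvc": r"C:\Program Files\Bad App\svc.exe -run",
    "PlainSvc": r"C:\Windows\system32\svchost.exe -k netsvcs",
}
flagged = [name for name, path in services.items()
           if is_unquoted_service_path(path)]
```

A quick pass like this over exported service definitions separates genuine findings from noise before anyone touches a production host.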

During project downtime — or when the security function is between remediation planning cycles — focus on providing direct assistance where possible. Direct intervention is an opportunity to learn more about the business and the people operating the technology in the environment. It also provides direct value when an automated process fails to remediate, or cannot remediate, identified vulnerabilities.

This may also be required when already stressed IT teams cannot assist in remediation activity.

6. Targeted patch applications can be effective for specific products

Some situations call for applying a specific update to address large numbers of vulnerabilities that automatic updates continuously fail to fix. This is often seen with Microsoft security updates that did not apply completely or correctly in scattered months across several years and devices.

Search for and test the application of cumulative security updates. One targeted patch update may remediate dozens of vulnerabilities.

Once tested, use automated patch deployment tools like SCCM or remote monitoring and management (RMM) tools to stage and deploy the specific cumulative update.
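The payoff of a candidate cumulative update can be estimated straight from scan data before deployment. In the sketch below, the KB identifier and the CVE-to-update mapping are invented for illustration:

```python
from collections import defaultdict

# Hypothetical open findings from a scan: (device, CVE) pairs
findings = [
    ("HOST01", "CVE-2021-0001"), ("HOST01", "CVE-2021-0002"),
    ("HOST02", "CVE-2021-0001"), ("HOST02", "CVE-2021-0003"),
]

# Illustrative supersedence map: which cumulative update fixes which CVEs
cumulative_fixes = {
    "KB-EXAMPLE": {"CVE-2021-0001", "CVE-2021-0002", "CVE-2021-0003"},
}

def remediation_impact(findings, cumulative_fixes):
    """Count how many open findings each cumulative update would close."""
    impact = defaultdict(int)
    for _, cve in findings:
        for kb, fixed_cves in cumulative_fixes.items():
            if cve in fixed_cves:
                impact[kb] += 1
    return dict(impact)

impact = remediation_impact(findings, cumulative_fixes)
```

Ranking candidate updates by the number of findings they would close helps justify the testing effort for a single targeted patch.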

7. Limit scan scope and schedules 

Vulnerability management seeks to identify and remediate vulnerabilities, not cause production downtime. Vulnerability scanning tools can unintentionally disrupt information systems and networks via the probing traffic they generate toward an organization’s devices and equipment.

If an organization is onboarding a new scanning tool or spinning up a new vulnerability management practice, it is best to start by scanning a small network subset that represents the asset types deployed across the network.

Over time, scanning can be rolled out to larger portions of the network as successful scanning activity on a smaller scale is consistently demonstrated.
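One way to pick that initial subset is to sample a few hosts from each asset type rather than sweeping a whole subnet. A sketch, assuming a hypothetical asset inventory keyed by IP address:

```python
from collections import defaultdict

# Hypothetical asset inventory: IP address -> asset type
inventory = {
    "10.0.1.10": "workstation", "10.0.1.11": "workstation",
    "10.0.1.12": "workstation", "10.0.1.20": "server",
    "10.0.1.21": "server", "10.0.1.30": "printer",
}

def pilot_scope(inventory, per_type=1):
    """Select a small pilot scan scope that still covers every asset type."""
    by_type = defaultdict(list)
    for ip, kind in inventory.items():
        by_type[kind].append(ip)
    # Take the first N addresses (sorted for determinism) per asset type
    return sorted(ip for ips in by_type.values() for ip in sorted(ips)[:per_type])

targets = pilot_scope(inventory)
```

The resulting target list can then be fed to the scanner, and `per_type` raised as confidence in the tool grows.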

8. Leverage analytics to focus remediation activity 

The native reporting functions provided by vulnerability scanning tools are typically too limited to support value-add vulnerability reduction. Consider implementing analytics tools like Power BI, which can help the organization focus on the following:

  • New vulnerabilities by type or category
  • Net new vulnerabilities
  • Risk severity ratings for groups of or individual vulnerabilities
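Most of these metrics reduce to set operations on successive scan exports. A minimal sketch of the net-new calculation, using invented findings:

```python
def scan_delta(previous, current):
    """Compare two scans, each a set of (host, vuln_id) findings, to surface
    net-new vulnerabilities versus those remediated since the last scan."""
    return {
        "net_new": sorted(current - previous),
        "remediated": sorted(previous - current),
        "open_total": len(current),
    }

previous = {("HOST01", "CVE-2021-0001"), ("HOST01", "CVE-2021-0002")}
current = {("HOST01", "CVE-2021-0002"), ("HOST02", "CVE-2021-0003")}
delta = scan_delta(previous, current)
```

Tracking these deltas over time, rather than raw totals, shows whether remediation is outpacing the arrival of new vulnerabilities.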

9. Avoid overlooking compliance pitfalls or licensing issues

Ensure you fully understand any licensing requirements in relation to enterprise usage of third-party software and make plans to stay compliant.

As software evolves, its creators may look to harness new revenue streams, which has real-world impacts on vulnerability management efforts. A classic example is Java, which is highly prevalent in organizations across the globe. As of 2019, Oracle requires a paid subscription to receive Java security updates for commercial use.

Should a third party decide to perform an onsite audit of license usage, the company may find itself tackling a lawsuit on top of managing third-party software security updates.


Key Steps for Public Sector Agencies To Defend Against Ransomware Attacks

Over the past two years, the pandemic has fundamentally altered the business world and the modern work environment, leaving organizations scrambling to maintain productivity and keep operational efficiency intact while securing the flow of data across different networks (home and office). While this scenario has undoubtedly created new problems for businesses in terms of keeping sensitive data and IP safe, the “WFH shift” has opened up even greater risks and threat vectors for the US public sector.

Federal, state, and local governments, education, healthcare, finance, and nonprofit organizations are all facing privacy and cybersecurity challenges the likes of which they’ve never seen before. Since March 2020, there’s been an astounding increase in the number of cyberattacks, high-profile ransomware incidents, and government security shortfalls, and many more go undetected or unreported. This is in part due to employees now accessing their computers and organization resources/applications from everywhere but the office, which is opening up new security threats for CISOs and IT teams.

Cyberthreats are expected to grow exponentially this year, particularly as the world faces geopolitical upheaval and international cyberwarfare. Whether it’s a smaller municipality or a local school system, no target is too small these days, and everyone is under attack due to bad actors now having more access to sophisticated automation tools.

The US public sector must be prepared to meet these new challenges and focus on shoring up vulnerable and critical technology infrastructures while implementing new cybersecurity and backup solutions that secure sensitive data.

Previous cyber protection challenges

As data volumes grow and methods of access change, safeguarding US public sector data, applications, and systems involves addressing complex and often competing considerations. Government agencies have focused on securing a perimeter around their networks; however, with a mobile workforce and a growing number of devices, endpoints, and sophisticated threats, data remains extremely vulnerable. Hence the massive shift toward a Zero Trust model.

Today, there is an over-reliance on legacy and poorly integrated IT systems, leaving troves of hypersensitive constituent data vulnerable, and government agencies have become increasingly appealing targets for cybercriminals. Many agencies still rely on five-decade-old technology infrastructure and deal with a multitude of systems that need to interact with each other, which makes locking down these systems even more challenging. Critical infrastructure industries have more budget restraints than ever; they need flexible and affordable solutions to maintain business continuity and protect against system loss.

Protecting your organization’s data assets

The private sector, which owns and operates most US critical infrastructure, will continue being instrumental in helping government organizations (of all sizes) modernize their cyber defenses. The US continues to make strides in creating specific efforts that encourage cyber resilience and counter these emerging threats.

Agencies and US data centers must focus on solutions that attest to data protection frameworks like HIPAA, CJIS, NIST 800-171 first and then develop several key pillars for data protection built around the Zero Trust concept. This includes safety (ensuring organizational data, applications, and systems are always available), accessibility (allowing employees to access critical data anytime and anywhere), and privacy and authenticity (control who has access to your organization’s digital assets).

New cloud-based data backup, protection, and cybersecurity solutions that are certified and compliant with the appropriate frameworks will enable agencies to maximize operational uptime, reduce the threat of ransomware, and ensure the highest levels of data security possible across all public sector computing environments.


First and foremost, the public sector and US data centers must prioritize using compliant and certified services to ensure that specific criteria are met.


What Digital Transformation Truly Means, with Srini Alagarsamy

Searching for Digital Transformation on Google fetches over five hundred million results, a good number of which aim to define the term. What follows are my thoughts from the vantage point of having strategized and executed digital transformations at leading organizations. Throughout history, firms have had to transform many times, from the invention of money to the advent of electricity, and through the industrial, railroad, communication, and internet revolutions. Transformations have generally occurred roughly every 50 years, which makes transformation an iterative process: a journey firms must embrace, understanding that it is about continually getting better, not reaching an end state. While a few firms help shape new consumer preference categories, the vast majority must adjust each time, reimagining their business model around those newly formed preferences and designing products, interactions, and business processes that all revolve around the customer.

Digital transformation is the latest iteration of business transformation, with firms adapting to new consumer preferences focused on digital channels. These preferences in the past two decades have largely tended to be at-the-glass (mobile, web) experiences. While digital twins have existed for some time, there is still a marked separation between the physical and virtual worlds. With the advent of Web 3.0 and Spatial Web, we are entering a different era of experience where these boundaries will continue to get blurred. Consumer preferences will shift from at-the-glass to inside-the-jar experiences.

In many firms, digital transformation conversations begin with discussions around Cloud, Agile, DevOps, AI, ML, Data Science, etc. While these are key building blocks, they are only a means to an end. They are the How, not the Why or the What. This is akin to picking up the hammer before knowing where the nails are and, in my humble opinion, needs to change. Every company that aims to drive digital transformation ought to ask Why they need to change and What the customer will gain as part of that change. If they choose to go directly to the How, they could call their efforts digitization or digitalization, but not digital transformation.

To truly embrace digital transformation, deliberate analysis of consumer needs, planning, and execution is important.


Start with Why:

Every firm must ask these questions before embarking on a digital transformation:

  •     Will this create a fundamental shift in customer experience for the good?
  •     Will this create net new opportunities for the firm or its customers?
  •     Will this create significant operational efficiencies for the firm?
  •     Will this create marketplace differentiation?


Define the What:

  •     Based on the Why, create the manifestations that reach customers best. These could be products, platforms, services, experiences.


Apply the How:

What must the firm do to bring the What to life effectively and efficiently? Some objective questions to ask here:

  •     Which enablers will get us there? Agile, Cloud, DevOps?
  •     Is a cultural transformation needed before you digitally transform?
  •     Should we build, buy, partner, or a combination of the three?


In summary, transformation is a continuous journey, and successful firms will constantly transform themselves in ways that fundamentally alter the value they offer their customers. If done well, transformation should not feel like a project or a program; it becomes business as usual. My suggestion is to start with the Why.


Srini Alagarsamy | Vice President, Digital Software Solutions at GM Financial

I am a technologist both by passion and profession. My first interaction with computers was in my late teens but I soon fell in love with developing software and experimenting with hardware. Professionally, I have been fortunate to be part of world-class organizations, driving major business and digital transformation initiatives. Through this blog, I intend to share perspectives I have gained and lessons I have learned about business and leadership.

Finding the right MSSP for securing your business and training employees

Over the past year, small businesses have had to navigate the pandemic’s many challenges — from changes in business models and supply shortages to hiring and retaining employees. On top of these pandemic-driven challenges, SMBs also faced a growing business risk: cybersecurity incidents.

Cybercriminals often target SMBs due to the limited security resources and training that leave these businesses vulnerable. A Verizon study found that 61% of all SMBs reported at least one cyberattack during 2020, with 93% of small business attacks focused on monetary gain. Unfortunately, many SMBs are forced to close after an incident due to the high costs incurred during a cyberattack.

Cybersecurity is no longer just “nice to have” for SMBs, but many business owners don’t know where to start. And while measures like a VPN or antivirus system can help, they aren’t enough by themselves. Managed security service providers (MSSPs) are a valuable resource for SMBs, allowing them to bring in the expertise needed to secure infrastructure that they might not be able to afford in this highly competitive labor market.

When looking for an MSSP, hundreds of options often leave businesses overwhelmed. To learn more about the value MSSPs should and can bring to the table, I spoke with Frank Rauch and Shay Solomon at Check Point Software Technologies.

Koziol: What should small and medium business owners look for when selecting a cybersecurity MSSP? What are the must-haves and the nice-to-haves?

Rauch: We are living in a time where businesses, SMBs especially, cannot afford to leave their security to chance. SMBs are a prime target for cybercriminals, as SMBs inherently struggle with the expertise, resources and IT budget needed to protect against today’s sophisticated cyberattacks. We are now experiencing the fifth generation of cyberattacks: large-scale, multi-vector, mega attacks targeting businesses, individuals and countries. SMBs should be looking for a true leader in cybersecurity. They should partner with an MSSP that can cover all customer sizes and all use cases. To make it easy, we can focus on three key areas:

  1. Security. The best MSSPs have security solutions that are validated by renowned third parties. They should prove their threat prevention capabilities and leverage a vast threat intelligence database that can help prevent threats at a moment’s notice.
  2. Capabilities. MSSPs should offer a broad set of solutions for customers of any size, from large enterprises to small businesses, covering data centers, mobile, cloud, SD-WAN protection, all the way to IoT security. Having this broad range of expertise will ensure that your MSSP is ready to cover your business in all instances.
  3. Individualized. This may be one of the most critical areas. Your MSSP should be offering flexible growth-based financial models and provide service and support 24/7 with real-time prevention. Collaborative business processes and principles will ensure success and security in the long run.

Koziol: How can SMBs measure the value of bringing in an MSSP? Or, the risks of inaction?

Rauch: The biggest tell-tale sign of a match made in heaven is if you’re receiving your security needs through one single vendor. If not, those options are out there! Getting the best security through one experienced, leading vendor can reduce costs, simplify support, and ensure consistency across all products. This ranges from simply protecting your sensitive data all the way to securing the business through a centralized security management platform. How can you protect what you can’t see?

It makes sense to keep an eye on how many cybersecurity attacks you’re preventing each month. How long is it taking you to create, change and manage your policies? Are you scaling to your liking? Can you adapt on the fly if need be? Are your connected devices secure? These are just some examples that you should be able to measure with simplicity.

Koziol: How has the shift in remote/hybrid workforce changed how cybersecurity MSSPs support SMBs?

Rauch: The shift to a larger work-from-home practice has caused attackers to move their attacks outside the corporate network. It is more important now than ever for MSSPs to provide their SMBs with a complete portfolio — endpoint, mobile, cloud, email and office — that allows them to connect reliably, scale rapidly and stay protected, no matter the environment.

The best MSSPs should have been ready for this day. At any moment, day or night, your organization can be victimized by devastating cybercrime. You can’t predict when cyberattacks will happen, but you can use proactive practices and security services to quickly mitigate their effects or prevent them altogether. The shift to a hybrid workforce exposed the holes in the existing security infrastructure.

On the bright side, security incidents present an opportunity to comprehensively reevaluate and improve information security programs. They show threat vectors that we previously overlooked and raise awareness across the organization to enhance existing or implement new controls. So at the very least, this shift has been an eye-opener for MSSPs.

Koziol: Should MSSPs offer security awareness and training as part of their offering? Why?

Solomon: Absolutely, yes. At the end of the day, knowledge is power. Cyberattacks are evolving, and training can help keep SMB employees protected and educated. According to a study from VIPRE, 47% of SMB leaders reported keeping data secure as their top concern. At the same time, many SMBs lack sufficient skills and capacity to drive improved security on their own.

The only way to fight cybercrime effectively is by sharing experiences and knowledge. Due to the cyber shortage, Check Point Software, along with 200 global training partners, recently announced a free cybersecurity training program called Check Point Mind. It offers many training and cybersecurity awareness programs to give SMBs (or any business) the chance to extend their skills with comprehensive cybersecurity training programs led by world-class professionals.

Koziol: How can working with an MSSP on security awareness education improve a business’s overall security posture?

Solomon: Raising awareness with employees is a crucial step that’s often overlooked. Employees need to be able to identify a phishing attempt and know how to react. In our experience, we see a majority of employees attacked using emails. They receive an email that looks like an official email from someone with authority, asking them to open attachments or click on a link that contains malicious intent.

If employees go through a training course that teaches them what to look for in an attack, this will surely reduce the chance of that employee falling victim to the phishing attempt.

Koziol: What questions should SMBs be asking their current or future MSSPs about cybersecurity?

Solomon: Building on what was mentioned earlier, it is never too late to reevaluate and improve information security programs. Asking questions and investing in a better security posture shows us threat vectors that we previously might have overlooked and raises awareness across the organization to the need to improve existing or implement new controls. SMBs must proactively approach their MSSPs to ensure they are getting the best bang for their buck—security solutions that require minimal configuration and simple onboarding. In addition, they need to ensure they are taking the proper steps when evaluating security architecture, advanced threat prevention, endpoint, mobile, cloud, email and office.

Koziol: What’s ahead for MSSPs in the cybersecurity space? What should SMB owners expect to see next?

Rauch: One of the key areas we’ll see continuously growing is the need for a next-generation cybersecurity solution that enables organizations to proactively protect themselves against cyberthreats: incident detection and response management. As attacks continue to evolve and grow in numbers, unified visibility is a must-have across multiple vectors that a cyberthreat actor could use to attack a network.

A common challenge we see is an overwhelming volume of security data generated by an array of stand-alone point security solutions. What’s needed is a single dashboard, or, in other words, unified visibility, that enables a lean security team to maximize its efficiency and effectiveness. SMBs should take the opportunity to review their security investments. The highest level of visibility, reached through consolidation, will guarantee the best effectiveness.


Talent Shortage: Are Universities Delivering Well-Prepared IT Graduates?

The tech talent crunch is impacting organizations of all sizes. The lack of qualified IT specialists is a rising concern with no end in sight.

Taking into account accelerated retirement plans of the Baby Boomers and the “great resignation” spurred by the pandemic, tech companies will be even more reliant on the upcoming generation of university graduates to fill the ranks of data specialists, AI experts, and software engineering pros.

Josh Drew, Boston regional director at Robert Half, a staffing and talent solutions company, says he regularly sees first-year computer science graduates take job opportunities in the development space with salaries ranging from $90,000 to north of $100,000.

“If you look at the opportunities of coming directly out of school and the skillset they leave with, I think there is a clear indication the university formula is working,” he says.

He added that the pandemic, which forced almost all university students into a totally virtual learning space, has also prepared them for the more flexible, part-time remote work.

“They’ve been doing online classes, and instead of turning their assignments in to teachers, they’re uploading and hosting them on sites, sharing through Google or Slack,” Drew says. “The model fits well with the hybrid workplace in the sense that it’s not always on-site turning in hard work — it’s working in a virtual world or outside of the classroom.”

Changing Tech Landscape

However, there is some concern that the most in-demand skills are not being taught and that universities aren’t equipping graduates with soft skills like communication.

Catherine Southard, vice president of engineering at D2iQ, says her company hasn’t had much success finding new grads with experience in Kubernetes and the Go programming language, in which D2iQ’s product is primarily developed.

“Part of that is because the tech landscape changes so quickly. It would be great for a representative from tech companies — maybe a panel of CTOs — to sit down with curriculum developers every couple of years and talk through industry trends and where technology is headed, and then brainstorm how to bridge the gap between university and industry,” she says.

Southard added that students can research jobs that look interesting, then see what tech stack those companies use. They can then equip themselves to land those jobs by studying that technology through free online resources or courses.

Importance of Internship Programs

She sees another area of improvement in support for internship programs. Historically, D2iQ had a program in the US, but it was expensive to operate, and it didn’t lead to long-term employee retention, except for a couple of stand-out talents.

She noted that larger tech companies can sponsor internship programs, but for startups, she would like to see universities split some of the operating costs as an investment in their students.

“We have had success hiring student workers in our German office, and that is an excellent setup for all involved,” she says. “We get smart, motivated students, the students get real-world experience, and our engineers can focus on more challenging problems as the students are able to perform more basic tasks.”

She explained a lot of people looking to change careers participate in code camps lasting a couple of months. These camps give them the necessary skills to hit the ground running as developers: Universities might do well to look at what those programs are doing and create a similar curriculum.

“A four-year degree is great, and there are lots of benefits to it, but it’s really not required anymore to be a developer,” Southard points out. “Universities should make sure their graduating students are as immediately employable as code camp graduates.”

Code Camps

Drew pointed out that in the Boston area, he’s seen the growth of these code camps, as well as different academies and even school contests geared around topics like ethical hacking.

“More than ever, the curriculum within the IT space is definitely like real-world applications,” he says.

He’s seen the development of e-commerce and entrepreneurial programs where students build and develop products to sell on websites.

“They’re bridging the gap with soft skills and teamwork, collaborating with others and often in a virtual environment,” Drew says.

Southard also notes that most physical science students will only have a few required English courses, and no communication courses, but success in the workplace will come down to their attitude and their abilities to collaboratively solve problems and communicate clearly.

“If universities could have a peer review process on programming assignments using standard industry tooling such as Github, that would help build some of these skills and better prepare students for their career,” Southard says.

From the perspective of Kevin Chandra, the Gen-Z co-founder and CEO of Typedream, his university experience at the University of Southern California adequately prepared him for a career in the real world.

“Our universities teach us the fundamentals of computer science; the reason that they do this is because technologies change very quickly,” he says. “If universities were to teach industry-standard technologies, by the time you graduate it will all have changed.”

Exposure to Tech Community

Chandra says what he thinks is currently lacking in universities is providing IT students with more exposure to the tech community.

“I have learned from Twitter, Substack blogs, and podcasts about relevant trends, technologies, and marketing strategies much more than I could ever have from outdated books,” he says. “I wish universities invited thought leaders to lecture students at universities.”

Mohit Tiwari, co-founder and CEO at Symmetry Systems, agreed with Chandra that universities excel at teaching young IT professionals the type of fundamental technical skill sets that can set students up for decades.

“For example, students trained in programming languages, distributed systems, and data engineering can now work on critical infrastructure problems like privacy and cloud security,” he says.

More broadly, universities are also a critical staging area before the students are launched into production.

“The goal is providing those students with a safe place to make mistakes and learn in a cohort with mentors to help, and not to load them with every skill they will need for 30 years,” Tiwari added.

Tiwari says tech companies could do a better job of reaching out to IT students, or of forming relationships with higher education institutions, by creating open-source testbeds that reflect real-world deployments and that students can use as projects.

Chandra says he felt big tech companies, especially in the US, do a good job reaching out to university grads by providing internship opportunities starting from freshman year.


Is a Merger Between Information Security and Data Governance Imminent?

As with any merger, it is always difficult to predict an outcome until the final deal papers are signed and press releases hit the wires. However, there are clear indications that a tie-up between these two is essential, and we will all be the better for it. Data Governance has historically focused on the use of data in a business, legal, and compliance context rather than how it should be protected, while the opposite is true for Information Security.

The idea of interweaving Data Governance and Information Security is not entirely new. Gartner discussed this in their Data Security Governance Model, EDRM integrated multiple stakeholders including Information Security, Privacy, Legal, and Risk into an overarching Unified Data Governance model, and an integrated approach to Governance, Risk, and Compliance has long been an aspiration in the eGRC market. Organizations that have more mature programs are likely to have some level of integration between these functions already, but many continue to struggle with the idea and often treat them as separate, siloed programs.

As programs go, Information Security is ahead of Data Governance in the level of attention it receives in the boardroom, brought about primarily by newsworthy events that demonstrated what security and privacy practitioners had been warning about for a long time. These critical risks to the public and private sectors inspired significant, sweeping frameworks and industry standards (PCI, NIST, ISO, ISACA, SOC 2) and regulatory legislation (HIPAA, GDPR, NYDFS), and gave Chief Information Security Officers (CISOs) a platform for change.

By contrast, data governance has been more fragmented in its definition, organization, development, and funding. Many organizations accept the value of data governance, particularly as a proactive means to minimize risk while enabling the expansive use of information required in today’s business environment. However, enterprises still struggle to balance information risk and value, and to establish the right enablers and controls.


Risks and affirmative obligations associated with information are the primary drivers for the intersection of data governance and information security. Information security is so critical because the loss (through exfiltration, or loss of access due to ransomware) of certain types of data carries legal and compliance consequences, along with impacting normal business operations. And a lack of effective legal and compliance controls often leads to increased information security and privacy risk.

Additional common drivers include:

  • Volume, velocity, mobility, and sensitivity of information
  • Volume and complexity of legal, compliance, and privacy requirements
  • Hybrid technology and business environments
  • Multinational governance models and operations
  • Headline and business interruption risks

Finally, an underlying driver is the need to leverage investments in technology, practices, and personnel across an organization. The interrelationships of so many information requirements simply demand a more coordinated approach.

Merging the models

We chose Information Risk Management to define a construct that encompasses the overarching disciplines and requirements. First, we did so because it places the focus on information: for example, the same piece of information that requires protection may also have retention and discovery requirements. Second, risk management recognizes the need to balance the value and use of information from a business perspective while also providing appropriate governance and protection. Risk management also serves as an important means to evaluate priorities in investment, resources, and audit functions.

Figure 1: Information Risk Management

The primary objective is to integrate processes, people, and solutions into a framework that addresses common requirements, and does so “in depth” for both disciplines. Security people, practices, and technologies have long been deployed at many levels (in depth) to protect the organization. The same has not often been the case for governance (legal, compliance, and privacy) obligations. New practices and technologies are enablers for intersecting programs and support alignment among key constituencies, including Information Security, IT, Legal, Privacy, Risk, and Compliance. Done right, this provides leverage on an organization’s human and technology investments, improves risk posture, and increases the rate and reach of new practices and solutions.

Meshing the disciplines and elements of each program is not meant as a new organizational construct; rather, it should start with a firm understanding of information requirements from key stakeholders, and from there establish synergies. The list below, not meant to be exhaustive, provides examples of shared enabling practices and technologies:

Figure 2: Shared Enablers and Requirements


Integrating data governance, information security, and privacy frameworks allows an enterprise to gain leverage from areas of common investment and provides a more comprehensive enterprise risk management strategy. By improving proactive information management, organizations increase preventative control effectiveness and decrease reliance on detection and response activities. It also develops cross-functional capabilities across Privacy, Legal, Compliance, IT, and Information Security.



Master Data Management (MDM) Framework With Arvind Joshi

Introduction – Reference Data vs. Master Data

It is very common for people to use ‘Reference Data’ and ‘Master Data’ interchangeably without understanding and appreciating the differences.

Reference data – External data that define the set of permissible values to be used by other data fields. Reference data gain value when they are widely re-used and widely referenced. Typically, they do not change much in definition, apart from occasional revisions. Examples – Country Code, Asset Category, Vendor_ID, Currency Code, Industry Code, Security_ID (CUSIP, SEDOL, ISIN).
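Because reference data define permissible value sets, a common use is validating other records against them. A minimal sketch (the codes and field names here are illustrative, not a complete reference set):

```python
# Illustrative reference data: small, slowly changing sets of permissible values.
REFERENCE_DATA = {
    "country_code": {"US", "CA", "GB", "DE", "JP"},
    "currency_code": {"USD", "CAD", "GBP", "EUR", "JPY"},
}

def validate(record: dict) -> list[str]:
    """Return the fields whose values fall outside the permissible set."""
    return [
        field
        for field, allowed in REFERENCE_DATA.items()
        if field in record and record[field] not in allowed
    ]

errors = validate({"country_code": "US", "currency_code": "ZZZ"})
# "ZZZ" is flagged because it is not in the currency_code reference set.
```

The value of the reference set comes from re-use: every application that validates against the same set agrees on what a permissible country or currency code is.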

Master data – Internal dimensional data that directly participate in a transaction, such as Customer_ID, Product_ID, Dept_ID, and Employee_ID. Master data are critical for business and fall generally into four groupings: concepts, people, places, and things. Further categorizations within those groupings are called subject areas, domain areas, or entity types.

For example:

  • Within concepts, there are deals, contracts, warranties, and licenses.
  • Within people, there are customers, employees, and relationship managers.
  • Within places, there are office locations and geographic divisions.
  • Within things, there are products, business lines/units, and accounts.

Some domain areas may be further divided. Customer may be further segmented, based on relationships, market cap, incentives and history. A company may have normal customers, as well as premiere and executive customers. Product may be further segmented by sector, industry and geography/region.

The requirements and data life cycle for a product in the Financial Services Industry (FSI) are likely very different from those in the Insurance Industry. The granularity of domains is essentially determined by the magnitude of differences between the attributes of the entities within them.

Considerations – deciding why and what to manage

Because master data is used by multiple applications, any error in master data has a ripple effect in all downstream applications consuming it. For example, an incorrect address in the customer master may mean orders, invoices/bills, confirms, and marketing literature are all sent to the wrong address. Similarly, an incorrect price in a Product Master can be a trade disaster, and an incorrect account number in an Account Master can lead to huge penalties.

Most organizations have more than one set of master data. This would be fine if the sets could simply be unioned, but very likely some customers and products will appear in both sets, usually with different formats and different identifying keys. In most cases, customer IDs and product codes are assigned by the application that creates the master records, so the chance of the same customer or the same product having the same identifier in both databases is remote.
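This matching problem can be sketched in a few lines. Real MDM tools use fuzzy matching and survivorship rules; the hypothetical example below matches on a normalized name plus postal code, precisely because the two systems' customer IDs are unrelated:

```python
# Hypothetical sketch: matching the same customer across two systems whose
# identifying keys (cust_id vs. cust_no) were assigned independently.
def match_key(record: dict) -> tuple:
    """Normalize name (case, whitespace) and pair it with the postal code."""
    name = " ".join(record["name"].lower().split())
    return (name, record["postal_code"])

crm = [{"cust_id": "C-1001", "name": "Acme  Corp", "postal_code": "10001"}]
billing = [{"cust_no": 77, "name": "acme corp", "postal_code": "10001"}]

index = {match_key(r): r for r in crm}
matches = [(index[match_key(r)]["cust_id"], r["cust_no"])
           for r in billing if match_key(r) in index]
# matches pairs the CRM and billing identifiers that refer to one customer.
```

Even this toy version shows why format differences matter: without normalizing case and whitespace, "Acme  Corp" and "acme corp" would never link.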

Identifying master data entities is not complex, but not all data that fits the definition of master data needs to be managed as such. The following criteria can be used to classify and identify master data attributes.

  • Interactions: Master data are the nouns and transactional data are the verbs in data interactions. Reviewing these interactions can help identify and define master data. Facts (verbs) and dimensions (nouns) are represented in a similar way in a data warehouse. For example, in trading systems, master data is part of the trade record. An employee reports to their manager, who in turn reports up through another employee (a hierarchical relationship). Products can be part of multiple market segments and roll-ups.
  • Data life cycle: Categorization of master data can be based on the way it is created, read, updated, deleted, and searched. This data life cycle differs across master data element types and industries. For example, how a customer is created depends largely upon business rules, industry segment, and data systems. There may be multiple customer creation paths, directly through customer onboarding or through the operational systems. Additionally, how a customer element is created is certainly different from how a product element is created.


Arvind Joshi – Director, Data Management and Analytics Lead at Scotiabank


Arvind serves as the Data Governance Officer for U.S. Finance, and as such he is the primary point of contact for all U.S. Finance data matters including with Fed regulators. In his time with Scotiabank, Arvind has championed data as a strategic asset and participated in data governance team, project, and leadership meetings like US Data Council, US Operating Committee and US Finance Committee. His team is responsible for the execution of day-to-day data governance and management activities, remediation of data quality issues, and implementation of change management processes. His team works closely with U.S. Data Office colleagues to establish and maintain strong data management capabilities, such as data quality measurement and monitoring, data issue management, and data lineage.


Chase CIO Gill Haus Discusses Recruitment, Agile, and Automation

The world of banking and finance faces aggressive change in innovation, increasing the need to adapt to new evolutionary cycles in financial technology. As customers want more resources and guidance with their finances, institutions such as JPMorgan Chase must nimbly respond in a way that belies their large size.

Gill Haus, CIO of consumer and community banking (Chase) at JPMorgan Chase, spoke with InformationWeek about his institution’s approach to finding the right tech talent to meet demands for innovation, the growing importance of automation, and the personal directives he follows.

When looking at technology recruitment, what skillsets is Chase seeking, both to meet current needs and also for what may come next?

At the root of what we do, we are in the business of building complex features and services for our customers. We have about 58 million digitally active customers; they depend heavily on the services we provide. Technology is behind all those products and services we offer. We are looking for the quintessential engineers that have the background in Java, machine learning engineers, those that have mobile experience as well. We also have technologies that are in “heritage” — systems that we’ve had for many years and we’re looking for engineers that understand how to use those technologies. Not just to support them but to modernize them. The key of our practice is to make sure also that we have those engineers and talent in general that is adaptable … because the market is constantly changing.

Why this is important is not just so we can have talent come in and help us build great solutions; it is also a great opportunity for talent to grow themselves. We provide our employees opportunities to use those new technologies whether it’s public cloud, private cloud, or machine learning. Also, to grow the breadth of their experiences, whether they’re working on mobile technologies, backend systems, or some other solution that touches millions and millions of customers. We offer our employees the opportunity, whether they are an entry-level software engineer, we have programs like our software engineer program where we bring in talent from universities and boot camps to do training. We offer things across the organization where our talent can contribute and learn with teams to build solutions, learn how to use other technology, and become more adaptable.

Gill Haus, JPMorgan Chase

Are there particular technologies or methodologies that have come into play of late that Chase has wanted to adopt or look at?

We’ve made a large move to be an agile organization to organize around our products versus organizing around our businesses. The reason for that is we need to be able to build solutions quickly and those local teams — the product, technology, data, and design leaders — they’re more able to see what’s happening in the market, make decisions quickly, decide what to build or what service to provide, and make sure we’re applying that for our customer versus being organized in a way that makes it more difficult to operate.

The move to an agile work style is really key for us to compete.

The other [part] is the skills themselves. At our scale, machine learning absolutely. We have tons of data about our customers, on how customers are using our products. Customers ask us to provide them insights or guidance. If you go into our mobile app, we have something called Snapshot that tells you how you’re spending money compared to other people like you, ways you can save. Machine learning is the essence and power behind making that happen.

Mobile engineering is also incredibly important for us because more and more of our customers are moving to be digitally active in the mobile space. We want to be where our customers are.

What isn’t often talked about is a lot of our backend services, which is the main Java programming that we do, empowers all of this. From APIs to public cloud because when you deposit money, you’re using those rails. When you are executing machine learning models, you’re still using a lot of those rails.

While we are focused on a lot of the new, we’re also focused on modernizing the core that we have because that is so fundamental to the services we provide.

In terms of scouting tech talent, is there an emphasis on finding brand new graduates of schools that offer the latest skills, or on retraining existing staff to make use of their institutional knowledge as well?

All the above. The purpose-driven culture we have is really a big factor for us. Money is at the center of people’s lives. If you can create a positive experience for customers in using their money, whether they are able to save more, to pay for something they didn’t expect, or prevent fraud for them, it provides an incredible positive benefit to that individual. That’s important. Many of the people joining, or already at that firm, want to have that positive impact.

One of our software engineering programs is called Tech Connect, which is how we get in software engineers who might not have come in through the traditional software engineering degrees. It’s a way for them to go through training here and find a role within the organization. We also have the software engineering program where we look at entry level candidates coming in from colleges with computer science and other engineering degrees. For employees that we have here, we have programs like Power Up, which is at 20 JPMorgan Chase technology centers where over 17,000 employees meet on an annual basis. There they learn all different types of technologies, from machine learning, to data, to cloud. That allows us not only to have people that are here be trained but it makes it compelling to join the firm.



Top 15 cybersecurity predictions for 2022

Over the past several years, cybersecurity risk management has become top of mind for boards. And rightly so. Given the onslaught of ransomware attacks and data breaches that organizations experienced in recent years, board members have increasingly realized how vulnerable they are.

This year, in particular, the public was directly impacted by ransomware attacks, from gasoline shortages, to meat supply, and even worse, hospitals and patients that rely on life-saving systems. The attacks reflected the continued expansion of cyber-physical systems — all of which present new challenges for organizations and opportunities for threat actors to exploit.

There should be a shared sense of urgency about staying on top of the battle against cyberattacks. Security columnist John McClurg, Vice President and Ambassador-At-Large in Cylance’s Office of Security & Trust, explained it best in his latest Cyber Tactics column: “It’s up to everyone in the cybersecurity community to ensure smart, strong defenses are in place in the coming year to protect against those threats.”

As you build your strategic planning, priorities and roadmap for the year ahead, security and risk experts offer the following cybersecurity predictions for 2022.

Prediction #1: Increased Scrutiny on Software Supply Chain Security, by John Hellickson, Cyber Executive Advisor, Coalfire

“As part of the executive order to improve the nation’s cybersecurity previously mentioned, one area of focus is the need to enhance software supply chain security. There are many aspects included that most would consider industry best practice of a robust DevSecOps program, but one area that will see increased scrutiny is providing the purchaser, the government in this example, a software bill of materials. This would be a complete list of all software components leveraged within the software solution, along with where it comes from. The expectation is that everything that is used within or can affect your software, such as open source, is understood, versions tracked, scrutinized for security issues and risks, assessed for vulnerabilities, and monitored, just as you do with any in-house developed code. This will impact organizations that both consume and those that deliver software services. Considering this can be very manual and time-consuming, we could expect that Third-Party Risk Management teams will likely play a key role in developing programs to track and assess software supply chain security, especially considering they are usually the front line team who also receives inbound security questionnaires from their business partners.”
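The software bill of materials described in this prediction is, at its core, a structured inventory of every component a piece of software ships with. A hypothetical, minimal sketch in the spirit of the CycloneDX format (the component versions and the vulnerable-version list below are illustrative, not real advisories):

```python
# Hypothetical minimal SBOM: each third-party component is listed with its
# version and origin so purchasers can track and assess it.
sbom = {
    "bomFormat": "CycloneDX",
    "components": [
        {"name": "log4j-core", "version": "2.14.1", "supplier": "Apache"},
        {"name": "jackson-databind", "version": "2.13.0", "supplier": "FasterXML"},
    ],
}

# A purchaser (or a Third-Party Risk Management team) can then screen the
# inventory against a feed of known-vulnerable component versions.
VULNERABLE = {("log4j-core", "2.14.1")}  # illustrative entry only

flagged = [c["name"] for c in sbom["components"]
           if (c["name"], c["version"]) in VULNERABLE]
# flagged lists components needing remediation before acceptance.
```

This is exactly the Log4shell scenario: an organization with such an inventory could answer "do we ship a vulnerable log4j-core?" in seconds rather than weeks.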


Prediction #2: Security at the Edge Will Become Central, by Wendy Frank, Cyber 5G Leader, Deloitte


“As the Internet of Things (IoT) devices proliferate, it’s key to build security into the design of new connected devices themselves, as well as the artificial intelligence (AI) and machine learning (ML) running on them (e.g., tinyML). Taking a cyber-aware approach will also be crucial as some organizations begin using 5G bandwidth, which will drive up both the number of IoT devices in the world and attack surface sizes for IoT device users and producers, as well as the myriad networks to which they connect and supply chains through which they move.”


Prediction #3: Boards of Directors will Drive the Need to Elevate the Chief Information Security Officer (CISO) Role, by Hellickson


“In 2021, there was much more media awareness and senior executive awareness about the impacts of large cyberattacks and ransomware that brought many organizations to their knees. These high-profile attacks have elevated the cybersecurity conversations in the Board room across many different industries. This has reinforced the need for CISOs to be constantly on top of current threats while maintaining an agile but robust security strategy that also enables the business to achieve revenue and growth targets. With recent surveys, we are seeing a shift in CISO reporting structures moving up the chain, out from underneath the CIO or the infrastructure team, which has been commonplace for many years, now directly to the CEO. The ability to speak fluent threat & risk management applicable to the business is table stakes for any executive with cybersecurity & board reporting responsibilities. This elevated role will require a cybersecurity program strategy that extends beyond the standard industry frameworks and IT speak, and instead demonstrate how the cybersecurity program is threat aware while being aligned to each executive team’s business objectives that demonstrates positive business and cybersecurity outcomes. More CISOs will look for executive coaches and trusted business partners to help them overcome any weaknesses in this area.”


Prediction #4: Increase of Nation-State Attacks and Threats, by John Bambenek, Principal Threat Researcher at Netenrich


“Recent years have seen cyberattacks large and small conducted by state and non-state actors alike. State actors organize and fund these operations to achieve geopolitical objectives and seek to avoid attribution wherever possible. Non-state actors, however, often seek notoriety in addition to the typical monetary rewards. Both actors are part of a larger, more nebulous ecosystem of brokers that provides information, access, and financial channels for those willing to pay. Rising geopolitical tensions, increased access to cryptocurrencies and dark money, and general instability due to the pandemic will contribute to a continued rise in cyber threats in 2022 for nearly every industry. Top-down efforts, such as sanctions by the U.S. Treasury Department, may lead to arrests but will ultimately push these groups further underground and out of reach.”


And, Adversaries Outside of Russia Will Cause Problems


Recognizing that Russia is a safe harbor for ransomware attackers, Dmitri Alperovitch, Chairman, Silverado Policy Accelerator: “Adversaries in other countries, particularly North Korea, are watching this very closely. We are going to see an explosion of ransomware coming from DPRK and possibly Iran over the next 12 months.”


Ed Skoudis, President, SANS Technology Institute: “What’s concerning about this potential reality is that these other countries will have less practice at it, making it more likely that they will accidentally make mistakes. A little less experience, a little less finesse. I do think we are probably going to see — maybe accidentally or maybe on purpose — a significant ransomware attack that might bring down a federal government agency and its ability to execute its mission.”


Prediction #5: The Adoption of 5G Will Drive The Use Of Edge Computing Even Further, by Theresa Lanowitz, Head of Evangelism at AT&T Cybersecurity


“While in previous years, information security was the focus and CISOs were the norm, we’re moving to a new cybersecurity world. In this era, the role of the CISO expands to a CSO (Chief Security Officer) with the advent of 5G networks and edge computing.

The edge is in many locations — a smart city, a farm, a car, a home, an operating room, a wearable, or a medical device implanted in the body. We are seeing a new generation of computing with new networks, new architectures, new use cases, new applications/applets, and of course, new security requirements and risks.

While 5G adoption accelerated in 2021, in 2022 we will see 5G go from new technology to a business enabler. While 5G will impact new ecosystems, devices, applications, and use cases ranging from automatic mobile device charging to streaming, it will also benefit from the adoption of edge computing due to the convenience it brings. We’re moving away from the traditional information security approach toward securing edge computing. With this shift to the edge, we will see more data from more devices, which will lead to the need for stronger data security.”


Prediction #6: Continued Rise in Ransomware, by Lanowitz


“The year 2021 was the year the adversary refined their business model. With the shift to hybrid work, we have witnessed an increase in security vulnerabilities leading to unique attacks on networks and applications. In 2022, ransomware will continue to be a significant threat. Ransomware attacks are more understood and more real as a result of the attacks executed in 2021. Ransomware gangs have refined their business models through the use of Ransomware as a Service and are more aggressive in negotiations by doubling down with distributed denial-of-service (DDoS) attacks. The further convergence of IT and Operational Technology (OT) may cause more security issues and lead to a rise in ransomware attacks if proper cybersecurity hygiene isn’t followed.

While many employees are bringing their cyber skills and learnings from the workplace into their home environment, in 2022, we will see more cyber hygiene education. This awareness and education will help instill good habits and generate further awareness of what people should and shouldn’t click on, download, or explore.”


Prediction #7: How the Cyber Workforce Will Continue to be Revolutionized Amid an Ongoing Shortage of Employees, by Jon Check, Senior Director of Cyber Protection Solutions at Raytheon Intelligence & Space


“Moving into 2022, the cybersecurity industry will continue to be impacted by an extreme shortage of employees. With that said, there will be unique advantages when facing the current so-called ‘Great Resignation’ that is affecting the entire workforce as a whole. As the industry continues to advocate for hiring individuals outside of the cyber industry, there is a growing number of individuals looking to leave their current jobs for new challenges and opportunities to expand their skills and potentially have the choice to work from anywhere. While these individuals will still need to be trained, there is extreme value in considering those who may not have the most perfect resume for the cyber jobs we’re hiring for, but may have a unique point of view on solving the next cyber challenge. This expansion will, of course, increase the importance of a positive work culture as such candidates will have a lot of choices of the direction they take within the cyber workforce — a workforce that is already competing against the same pool of talent. With that said, we will never be able to hire all the cyber people we need, so in 2022, there will be a heavier reliance on automation to help fulfill those positions that continue to remain vacant.”


Prediction #8: Expect Heightened Security around the 2022 Election Cycle, by Jadee Hanson, CIO and CISO of Code42


“With multiple contentious and high-profile midterm elections coming up in 2022, cybersecurity will be a top priority for local and state governments. While security protections were in place to protect the 2020 election, publicized conversations surrounding the uncertainty of its security will facilitate heightened awareness around every aspect of voting next year.”


Prediction #9: A Shift to Zero Trust, by Brent Johnson, CISO at Bluefin


“As the office workspace model continues to shift to a more hybrid and full-time remote architecture, the traditional network design and implicit trust granted to users or devices based on network or system location are becoming a thing of the past. While the security industry had already begun its shift to the more secure zero-trust model (where anything and everything must be verified before connecting to systems and resources), the increased use of mobile devices, bring your own device (BYOD), and cloud service providers has accelerated this move. Enterprises can no longer rely on a specific device or location to grant access.

Encryption technology is obviously used as part of verifying identity within the zero-trust model, and another important aspect is to devalue sensitive information across an enterprise through tokenization or encryption. When sensitive data is devalued, it becomes essentially meaningless across all networks and devices. This is very helpful in limiting security practitioners’ area of concern and allows for designing specific micro-segmented areas where only verified and authorized users/resources may access the detokenized, or decrypted, values. As opposed to trying to track implicit trust relationships across networks, micro-segmented areas are much easier to lock down and enforce granular identity verification controls in line with the zero-trust model.”
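The tokenization described above can be sketched as a simple token vault: sensitive values are swapped for random tokens, and only the vault inside the micro-segmented zone can reverse the mapping. This is an illustrative sketch only; production tokenization relies on hardened vaults or format-preserving encryption, and the card number used here is a standard test value:

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens back to original values."""

    def __init__(self):
        self._store = {}  # token -> original value, held only inside the vault

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a random, meaningless token."""
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; callable only by authorized resources."""
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
# Downstream systems store and transmit only the token, so a breach of those
# systems yields devalued data; detokenize() lives behind the zero-trust
# verification controls of the micro-segmented zone.
```

The key property is that the token carries no exploitable information, so the security team's area of concern shrinks to the vault itself.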



Prediction #10: Securing Data with Third-Party Vendors in Mind Will Be Critical, by Bindu Sundareason, Director at AT&T Cybersecurity


“Attacks via third parties are increasing every year as reliance on third-party vendors continues to grow. Organizations must prioritize the assessment of top-tier vendors, evaluating their network access, security procedures, and interactions with the business. Unfortunately, many operational obstacles will make this assessment difficult, including a lack of resources, increased organizational costs, and insufficient processes. The lack of up-to-date risk visibility on current third-party ecosystems will lead to loss of productivity, monetary damages, and damage to brand reputation.”


Prediction #11: Increased Privacy Laws and Regulation, by Kevin Dunne, President of Pathlock


“In 2022, we will continue to see jurisdictions pass further privacy laws to catch up with the states like California, Colorado and Virginia, who have recently passed bills of their own. As companies look to navigate the sea of privacy regulations, there will be an increasing need to be able to provide a real-time, comprehensive view of what data is being processed and stored, who can access it, and most importantly, who has accessed it and when. As the number of distinct regulations continues to grow, the pressure on organizations to put in place automated, proactive data governance will increase.”


Prediction #12: Cryptocurrency to Get Regulated, by Joseph Carson, Chief Security Scientist and Advisory CISO at ThycoticCentrify


“Cryptocurrencies are surely here to stay and will continue to disrupt the financial industry, but they must evolve into a stable method for transactions to accelerate adoption. Some countries have taken the stance that mining’s energy consumption creates a negative impact and are therefore facing decisions to either ban or regulate cryptocurrency mining. Meanwhile, several countries see cryptocurrencies as a way to differentiate their economies, become more competitive in the tech industry, and attract investment. In 2022, more countries will look at how they can embrace cryptocurrencies while also creating more stabilization, and increased regulation is only a matter of time. Stabilization will accelerate adoption, but the big question is how the value of cryptocurrencies will be measured. How many decimals will be the limit?”


Prediction #12: Application Security in Focus, by Michael Isbitski, Technical Evangelist at Salt Security


“According to the Salt Labs State of application programming interface (API) Security Report, Q3 2021, there was a 348% increase in API attacks in the first half of 2021 alone, and that number is only set to rise.

With so much at stake, 2022 will witness a major push from security and nonsecurity teams alike toward the integration of security services and automation, in the form of machine assistance, to mitigate issues arising from the growing threat landscape. The industry is beginning to understand that by treating API security as a strategic discipline rather than a subcomponent of other security domains, organizations can more effectively align their technology, people, and security processes to harden their APIs against attacks. Organizations need to identify their current level of API maturity and integrate processes for development, security, and operations accordingly; complete, comprehensive API security requires a strategic approach in which all three work in synergy.

To mitigate potential threats and system vulnerabilities, further industry-wide recognition of a comprehensive approach to API security is key. Next year, we anticipate that more organizations will see the need for and adopt solutions that offer a full life cycle approach to identifying and protecting APIs and the data they expose. This will require a significant change in mindset, moving away from the outdated practices of proxy-based web application firewalls (WAFs) or API gateways for runtime protection, as well as scanning code with tools that do not provide satisfactory coverage and leave business logic unaddressed. As we’ve already begun to witness, security teams will now focus on accounting for unique business logic in application source code as well as misconfigurations or misimplementations within their infrastructure that could lead to API vulnerabilities.

Implementing intelligent capabilities for behavior analysis and anomaly detection is another way organizations can improve their API security posture in 2022. Anomaly detection is essential for satisfying increasingly strong API security requirements and defending against well-known, emerging, and unknown threats. Solutions that effectively utilize AI and ML can give organizations visibility into, and monitoring of, all the data and systems that APIs and API consumers touch. Such capabilities also help mitigate manual mistakes that inadvertently create security gaps and could impact business uptime.”
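To make the anomaly-detection idea concrete, here is a deliberately simple sketch: it flags an API consumer whose latest request volume deviates sharply from its own baseline using a z-score. The function name, threshold, and sample numbers are all illustrative assumptions; commercial products use far richer behavioral models.

```python
import statistics


def is_anomalous(history, latest, z_threshold=3.0):
    """Flag `latest` as anomalous if it deviates from the consumer's own
    baseline by more than `z_threshold` standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        # A perfectly flat baseline: any change at all is suspicious.
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold


# Baseline: a consumer normally makes ~100 requests per minute.
baseline = [98, 102, 99, 101, 100, 97, 103]
assert not is_anomalous(baseline, 105)  # normal fluctuation
assert is_anomalous(baseline, 600)      # a sudden burst stands out
```

Even a crude per-consumer baseline like this catches the bursty patterns typical of credential stuffing or scraping; the point of ML-driven tooling is to do this across many behavioral dimensions at once.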


Prediction #13: Disinformation on Social Media, by Jonathan Reiber, Senior Director of Cybersecurity Strategy and Policy at AttackIQ


“Over the last two years, pressure rose in Congress and the executive branch to regulate Section 230, and it increased further following the disclosures made by Frances Haugen, a former Facebook data scientist, who came forward with evidence of widespread deception related to Facebook’s management of hate speech and misinformation on its platform. Concurrent with those disclosures, in mid-November, the Aspen Institute’s Commission on Information Disorder published the findings of a major report, painting a picture of the United States as a country in a crisis of trust and truth, and highlighting the outsize role of social media companies in shaping public discourse. Building on Haugen’s testimony, the Aspen Institute report, and findings from the House of Representatives Select Committee investigating the January 6, 2021 attack on the U.S. Capitol, we should anticipate increasing regulatory pressure from Congress. Social media companies will likely continue to spend large sums of money on lobbying efforts to shape the legislative agenda to their advantage.”


Prediction #14: Ransomware To Impact Cyber Insurance, by Jason Rebholz, CISO at Corvus Insurance


“Ransomware is the defining force in cyber risk in 2021 and will likely continue to be in 2022. While ransomware has gained traction over the years, it jumped to the forefront of the news this year with high-profile attacks that impacted the day-to-day lives of millions of people. The increased visibility brought a positive shift in the security posture of businesses looking to avoid being the next news headline. We’re starting to see the proactive efforts of shoring up IT resilience and security defenses pay off, and my hope is that this positive trend will continue. Comparing Q3 2020 to Q3 2021, the share of ransom demands that were actually paid fell from 44% to 12%, thanks to improved backup processes and greater preparedness. Decreasing the need to pay a ransom to restore data is the first step in disrupting the cash machine that is ransomware. Although we cannot say for certain, in 2022 we can likely expect to see threat actors pivot their ransomware strategies. Attackers are nimble, and although they’ve had a ‘playbook’ over the past couple of years, widespread crackdowns on their current strategies mean we expect things to shift. We have already seen the opening moves from threat actors. In a shift from a single group managing the full attack life cycle, specialized groups have formed to gain access into companies and then sell that access to ransomware operators. As threat actors specialize in gaining access to environments, the door opens to other extortion-based attacks, such as data theft or account lockouts, none of which require data encryption. These potential shifts will call for heavier investment in tracking emerging tactics and trends to reduce that volatility.”


How should your company think about investing in security?

As with many things in life, when it comes to your company’s application security, you get what you pay for. You can spend too little, too much, or just the right amount.

To find the right balance, consider Goldilocks: she goes for a walk in the woods and comes upon a house. In the house, she finds three bowls of porridge. The first is too hot, the second is too cold, but the third is just right.

Goldilocks is the master of figuring out “just right.” To determine the appropriate security budget for your company, you need to be, too.

How much security effort is too much?

First, let’s explore the idea of overinvesting in security. How much is too much?

At a certain point with security, you start to see diminishing returns: issues still appear but more rarely. Security is never really “done,” so it’s tricky knowing when to move on. There’s always more to do, more to find, more to fix. Knowing when to wrap up depends on your threat model, risk appetite and your unique circumstances.

However, your company probably isn’t in this category. Almost nobody is. You certainly can get there, but you’re likely not there now. The takeaway is this: even though you’re probably not in this category yet, it’s important to know that security is not an endless investment of resources. There is a point at which you can accept the remaining risk and move forward.

The problem with too little effort

On the other hand, companies often spend too little effort on security. Almost everyone falls into this category.

Security is often viewed as a “tax” on the business. Companies want to minimize any kind of tax, so they try to cut security spending inappropriately. What most people don’t realize is that when you cut costs, what you actually cut is effort: how much time you invest, how thorough the manual work is, how much attack surface you cover, and how deeply you develop custom exploits. That’s a dangerous gamble, because your attackers already invest more effort than you can; cutting your own effort just cedes them more of the advantage.

As a leader, you’re under tremendous pressure to make the best use of the limited money and person-power you have, and those resources need to cover a wide range of priorities. It’s sometimes hard to justify the investment in security, and even when you can, you aren’t always sure where the best place to invest it might be.

Here’s the harsh reality, though: the less you invest, the less it returns. When you cut costs too far, you prevent outcomes that help you get better. Achieving your security mission is going to cost you time, effort and money. There is no way around that. When those investments get cut to the bone, what’s really reduced is your ability to succeed.

The level of effort that’s “just right”

The trick to successful application security lies in finding your sweet spot, that magical balance where you uncover useful issues without investing too much or too little. There are many variables that influence this, including:

  • The value of your assets
  • The skills of your adversaries
  • The scope of your attack surfaces
  • The amount of risk you’re willing to accept

As a ballpark estimate, to do application security testing right is probably going to cost $30,000 to $150,000 or more per year, per application. Some cost far more than that.

That number might shock you; as discussed, most companies are in the category of spending too little. Security isn’t cheap because it isn’t easy: it requires a unique skill set, and it takes sustained effort.

However, doing security right is worth the price.

The incremental cost of doing security right is a microscopic speck compared to the gigantic cost of a security incident. Most importantly, since most companies struggle to do security right, those who do gain an enormous advantage over their competitors. You want to be one of those companies. To get there, you need to invest appropriately.
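The cost comparison above can be made concrete with a back-of-the-envelope annualized loss calculation. Every number below is hypothetical and chosen only to show the shape of the reasoning; plug in your own asset values and likelihoods.

```python
# Hypothetical figures, purely illustrative:
testing_cost = 100_000         # assumed annual testing budget per application
incident_cost = 4_000_000      # assumed cost of one serious breach
breach_probability = 0.10      # assumed annual breach likelihood without testing
reduced_probability = 0.02     # assumed likelihood with rigorous testing

# Expected annual loss = likelihood x cost of a breach.
expected_loss_without = breach_probability * incident_cost   # 400,000
expected_loss_with = reduced_probability * incident_cost     # 80,000

# Net benefit of the testing program after paying for it.
net_benefit = (expected_loss_without - expected_loss_with) - testing_cost
assert net_benefit == 220_000  # testing pays for itself in this scenario
```

Under these assumptions, a six-figure testing budget is dwarfed by the expected loss it averts, which is exactly the “speck versus gigantic cost” argument in numbers.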

There are no security shortcuts

Ultimately, you can’t achieve security excellence by going cheap. You can’t find the unknowns for cheap. You can’t discover custom exploits for cheap. You get what you pay for, and there’s no way around that. However, you also don’t need to spend endlessly either; even though there’s always more to fix, there is a point at which you can accept the remaining risk and move on.

The best approach is to channel your inner Goldilocks and find the budget that’s “just right” for your company. Figure out how rigorous and comprehensive an assessment your application requires, and don’t fall short of those standards.