The complexity of DevSecOps with Maria Schwenger

Apex talks to Maria Schwenger, AVP – Enterprise Digital Risk – Head of Application Security and Data Protection at American Family Insurance, about how application security has changed with rapid cloud adoption and about some of the new approaches to application security and data protection.

 

Q: You have been leading digital transformation programs working exclusively in the DevSecOps space for several years now – in your own words, “even before we could’ve imagined such a term”. What has remained the same and what has changed in DevSecOps implementations today? What are some of the new DevSecOps transformation strategies?

A: One thing that will always stay fundamentally the same is the very essence of the DevOps approach – the promise of speed in delivering value and the opportunity to adapt to market needs at scale. This is the very reason why companies implement DevOps and, more recently, DevSecOps. DevSecOps (also sometimes referred to as Rugged DevOps) brings the additional notion of implementing security as early as possible in the development process and in every phase of the Software Development Life Cycle (SDLC). The DevSecOps approach also allows security to be applied in an agile manner by incrementally maturing the security practices within the CI/CD pipeline while accounting for the possible vulnerabilities and risks. These are the fundamentals of the two terms.

Today we see a certain controversy between these two terms – some professionals get annoyed by having to talk about DevSecOps as a separate approach. They believe that if DevOps is done right, security will always be an integral part of the DevOps process. And this is probably a fair statement. It is so nice if we can implement the security practices in a way that security feels like an enabler (not a showstopper) of the DevOps process. Today, this is probably one of the most important transformation strategies around how we develop, release, and maintain code – the goal of having visibility and a clear understanding of the vulnerabilities and the associated risk (the probability of being exploited and the possible impact on the business). No one will dispute the importance of end-to-end automation and tight integration of the security processes and tooling within the CI/CD pipeline, and why not – let’s experiment and sprinkle some Artificial Intelligence (AI) on top to further optimize the DevSecOps process!

My take is simple – no matter what we call the process or combination of processes we decide to adopt (DevOps, DevSecOps, SecDevOps, SecOps, etc.), the main goal is to establish and deepen the transparency and the trust between the three teams – development, security, and operations – and to simplify the traditionally complex shared ownership of keeping our businesses online, safe, and agile.

 

Q: What are some of the challenges of modern DevSecOps? Where and how should we expect the security approach to change?

A: I am glad to see that more and more companies today understand the value of the tight integration between Dev, Sec, and Ops, and are attempting to establish the right level of automation and extended collaboration within their organizations. Every security and DevOps professional knows that for a while security was lagging behind the rapid agility required by DevOps, but now I see companies stepping onto the right path, addressing four main areas:

  • Slow security processes contradicting the DevOps approach of rapid agile/iterative delivery – traditionally, security processes were manual and sequential rather than iterative and fast, which caused delays in the DevOps cycle. The new approach here is to use the very opportunity the “agile” definition gives us – iteration! If the development process is iterative and continuous (CI/CD means “continuous integration” and “continuous delivery”), the logical solution is to build “continuous security” as well (a minimal example is sketched after this list). This is a huge opportunity for security professionals today.
  • Securing new technologies at scale – by its rapid, experimental nature, DevOps has quickly adopted many new technologies that the security teams were not ready to support at the same pace and scale. The challenges came from the adoption of new architectures (e.g., APIs, microservices), new technologies (e.g., cloud, containers, serverless), the role open source plays in the way we develop software today, etc.
  • Another major area is the security and the efficiency of the DevOps process itself. Do we have a holistic view of the end-to-end development process? Can we guarantee that it is fully protected and secured? What is the best way to integrate “continuous security” into the DevOps cycle? Because, let’s make no mistake – an insecure development process will most likely translate into an insecure production environment. We also know that remediation of vulnerability findings is time-consuming and can be complex, requiring specific skills. This is an area where the “Sec” part of the DevSecOps team should play a leading role.
  • Last, but not least, every company should take the time to rethink its global SDLC process in line with its own digital transformation, where security can never be an afterthought. Let’s not forget that DevSecOps is a people and cultural transformation as much as it is a technical, tooling, and process evolution.
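
To make “continuous security” a bit more concrete, here is a minimal sketch of what a security gate could look like as one step in a CI/CD pipeline. It is only an illustration under stated assumptions: the findings file name, its JSON structure, and the severity policy are invented for the example and are not the output format of any particular scanner.

```python
# Minimal sketch of a "continuous security" gate running as a CI/CD pipeline step.
# Assumption: earlier pipeline steps have written aggregated scanner findings to
# scan-results.json as a list of objects with "severity", "rule", and "location".
import json
import sys
from pathlib import Path

FINDINGS_FILE = Path("scan-results.json")   # hypothetical aggregated scanner output
BLOCKING_SEVERITIES = {"critical", "high"}  # example policy: fail the build on these


def main() -> int:
    if not FINDINGS_FILE.exists():
        print("No findings file produced - treating this as a scanner failure.")
        return 1

    findings = json.loads(FINDINGS_FILE.read_text())
    blocking = [
        finding for finding in findings
        if finding.get("severity", "").lower() in BLOCKING_SEVERITIES
    ]

    for finding in blocking:
        print(f"[{finding['severity'].upper()}] "
              f"{finding.get('rule', 'unknown rule')}: {finding.get('location', 'n/a')}")

    if blocking:
        print(f"Security gate failed: {len(blocking)} blocking finding(s).")
        return 1

    print("Security gate passed.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Because the script simply returns a non-zero exit code when blocking findings exist, any CI system can treat it as a failing stage, which is how the security check becomes part of every iteration rather than a one-time review.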

 

Q: You have stated that, with rapid cloud adoption and agile DevOps, the traditional application security tools and practices simply cannot keep up with the demand. What is the new way to think about implementing application security? How do you see the DevSecOps vendors supporting these needs today and in the future?

A: Yes, sadly enough, this is correct! In the last few years, application security has been seen as a hold-up by the software development teams – something that is inefficient, takes time, is hard to do, and, in many cases, is only available as a manual activity.

The right approach is to rethink the entire SDLC process in a whole new way and to enhance it into a security-enabled application development and deployment process. That means security is an integral part of each step of the SDLC. Let me throw a few terms out there, all of which aim to establish a continuous security approach within DevOps and the SDLC. Some colleagues talk about SSDLC – the Secure Software Development Life Cycle – meaning that we fully integrate the application security practices within every step of the development process. This leads to yet another term – “Shift Left” – which means moving security testing as early as possible (to the left) in the development process, when fewer changes are required compared to remediating findings later, when the code is ready to be released.
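
As a small illustration of “shift left”, the sketch below shows the kind of lightweight check that could run as a pre-commit hook, long before any release gate. The secret-detection patterns are deliberately simplistic assumptions and nowhere near a complete rule set; the wiring of the script into a hook framework is also left out.

```python
# Rough sketch of a "shift left" pre-commit check: scan staged files for
# obvious hard-coded secrets before the code ever reaches the pipeline.
import re
import subprocess
import sys
from pathlib import Path

# Illustrative patterns only - a real secret scanner has far richer rules.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # shaped like an AWS access key ID
    re.compile(r"(?i)(password|secret|api_key)\s*=\s*['\"][^'\"]{8,}['\"]"),
]


def staged_files() -> list[str]:
    """List file names currently staged for commit."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]


def main() -> int:
    hits = []
    for name in staged_files():
        path = Path(name)
        if not path.is_file():
            continue
        text = path.read_text(errors="ignore")
        for pattern in SECRET_PATTERNS:
            if pattern.search(text):
                hits.append((name, pattern.pattern))

    for name, pattern in hits:
        print(f"Possible secret in {name} (pattern: {pattern})")

    return 1 if hits else 0  # a non-zero exit code blocks the commit


if __name__ == "__main__":
    sys.exit(main())
```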

The security teams should also become enablers for the adoption of new technologies, such as Web Application Firewalls (WAFs), that provide runtime protection of the application layer.

Vendor support is an extremely important topic here. Yes, we need modern application scanning tools that are easy to integrate within the CI/CD pipeline or, even better, already pre-integrated within the development environment. I fully support the notion of testing the application code “from the inside out”/“from within”. And these tools should be intended to protect applications at both deployment and runtime.

 

Q: You have been helping many clients move to the cloud as part of their digital transformation strategy. What are some of the most common challenges, and what role does DevSecOps play in this?

A: Oh, this is such a big topic – probably for an entirely separate conversation. Let me see how I can summarize my thoughts at a high level.

Many businesses today are defining and implementing their own “cloud first” strategy, and the migration of applications to the cloud is first in line. Since we are now entering a quite well-established era of cloud adoption, companies are looking to get more benefits from digital transformation, relying on both cloud adoption (for speed and cloud economics) and DevOps (for agility and time to market). Of course, any cloud adoption should be underpinned by effective security practices.

There are so many similarities between cloud computing and DevSecOps. They are both pillars of digital transformation driving business growth, both are accelerators for streamlining processes and advancing automation, and both are facilitators of global collaboration. They are also accelerators for each other. Cloud provides the “on demand” usage and scalability needed to develop and run applications. DevSecOps is often asked to be the bridge toward cloud adoption – to lead in adopting the new architectures and technologies, to perform the “stretch” within a hybrid cloud, and to integrate securely across the new business practices.

Some of the common challenges naturally follow from the above. Companies need to grow skills and expertise to support their new cloud environments, and to establish effective ways to manage their cloud spending, budgets, and forecasting while retaining full control over the security of systems and data and over regulatory compliance. DevSecOps engineers need to lead with new technologies and architectures, executing “lift and shift” migrations or building new “cloud native” applications, handling a multitude of deployments to new cloud environments with complex cloud configurations, etc. Security (still listed as the number one concern) needs to handle extended access management controls and secure a much wider perimeter while remaining flexible and agile. Multi-cloud adoption and complex integrations with 3rd party SaaS offerings often require additional skills and attention. In addition, all teams need to get used to the shared responsibility model, where the cloud service provider is also part of the multidimensional collaboration.
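
One way DevSecOps automation helps with the misconfiguration challenge is a policy-as-code style check over the cloud estate. The sketch below is an assumption-heavy illustration: the resource inventory is hand-written for the example, whereas in a real pipeline it would come from the provider’s APIs or from infrastructure-as-code plans, and the two policies shown are just samples.

```python
# Sketch of a policy-as-code check for common cloud misconfigurations.
# The inventory is a hypothetical, simplified description of cloud resources.
INVENTORY = [
    {"type": "object_storage", "name": "customer-uploads", "public_access": True,  "encrypted": True},
    {"type": "object_storage", "name": "build-artifacts",  "public_access": False, "encrypted": False},
    {"type": "database",       "name": "claims-db",        "public_access": False, "encrypted": True},
]


def evaluate(resource: dict) -> list[str]:
    """Return the policy violations found on a single resource."""
    violations = []
    if resource.get("public_access"):
        violations.append("public access is enabled")
    if not resource.get("encrypted"):
        violations.append("encryption at rest is disabled")
    return violations


def main() -> int:
    failed = False
    for resource in INVENTORY:
        for violation in evaluate(resource):
            failed = True
            print(f"{resource['type']} '{resource['name']}': {violation}")
    return 1 if failed else 0


if __name__ == "__main__":
    raise SystemExit(main())
```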

There is a phrase that the cloud is a journey and not a destination (paraphrasing here). With respect to cloud and DevSecOps, it is a journey of building capabilities that enable our digital transformation for rapid business growth.

 

Q: Looking at all the security incidents, exposures, ransomware attacks, etc. today, what are some of the lessons learned in terms of application security? For example – what are some of the application security takeaways from the SolarWinds breach?

A: The SolarWinds incident brought a new dilemma to AppSec practitioners. It showed clearly that our internal SDLC process can be compromised by attackers even when we apply most of the best practices of secure engineering, and that even companies with most controls in place can still become vulnerable through a trusted vendor. We can review this from two separate angles:

  • In-house SDLC, or the security of our “software factory” – How well secured is our own application development process (build, test, and deploy)? Is everything we do routinely today in the AppSec space enough to protect and secure our custom-developed software, our homegrown applications? Do we have end-to-end visibility over the SDLC – from the moment a developer creates code and checks it into the repository all the way to what happens with this code when it is deployed in production and upgraded there? How effective is our security review process if the attackers have compromised the entire build process and used legitimate certificates to sign their code? (A minimal example of one such control is sketched after this list.)
  • Secure vendor management – What is the best way to continuously evaluate and monitor the 3rd party software products and SaaS offerings we are utilizing today?
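
Here is a minimal sketch of one “software factory” control: refusing to deploy an artifact whose digest does not match what was recorded at build time. The file names and manifest format are assumptions made for the example, and real supply-chain tooling would also verify the signature on the manifest itself rather than trusting it as plain JSON.

```python
# Sketch: verify that the artifact about to be deployed matches the digest
# recorded by the build step (hypothetical file names and manifest format).
import hashlib
import json
import sys
from pathlib import Path

ARTIFACT = Path("app-release.tar.gz")    # hypothetical build output
MANIFEST = Path("build-manifest.json")   # hypothetical record written at build time


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, streaming it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def main() -> int:
    expected = json.loads(MANIFEST.read_text())["sha256"]
    actual = sha256_of(ARTIFACT)
    if actual != expected:
        print("Artifact digest does not match the build manifest - refusing to deploy.")
        return 1
    print("Artifact digest verified.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```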

It is obvious that we need to be vigilant about our SDLC process and proactively monitor the way we design, build, and deploy our homegrown apps, as well as all integrations with 3rd party software. I see two main practices we can apply – “shift left” security within the CI/CD pipelines of our DevSecOps process, and a Zero Trust Architecture (ZTA) that protects our assets via continuous verification between systems, devices, applications, data stores, etc., based on a “never trust – always verify” approach.
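
The “never trust – always verify” idea can be illustrated with a deliberately simplified, deny-by-default authorization check that is evaluated on every request. This is only a sketch: real ZTA relies on strong identity providers, mTLS, device attestation, and continuous re-evaluation, and all the fields and rules below are illustrative assumptions.

```python
# Simplified sketch of deny-by-default, per-request Zero Trust authorization.
from dataclasses import dataclass


@dataclass
class Request:
    user: str
    authenticated: bool        # e.g., token validated against the identity provider
    mfa_passed: bool
    device_compliant: bool     # e.g., device posture attested as healthy
    resource_sensitivity: str  # "public", "internal", or "restricted"


def authorize(req: Request) -> bool:
    """Deny by default; grant only when every check for the sensitivity level passes."""
    if not req.authenticated:
        return False
    if req.resource_sensitivity == "public":
        return True
    if not req.device_compliant:
        return False
    if req.resource_sensitivity == "restricted" and not req.mfa_passed:
        return False
    return True


if __name__ == "__main__":
    demo = Request(user="svc-claims", authenticated=True, mfa_passed=False,
                   device_compliant=True, resource_sensitivity="restricted")
    print("allowed" if authorize(demo) else "denied")  # prints "denied" - MFA is missing
```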

 

Q: You mentioned implementing a Zero Trust Architecture and you recently presented on Zero Trust at one of our Apex events. What was most exciting to you about this panel? 

A: I believe that the concept of Zero Trust is an imperative for our post-pandemic world. I am always excited to discuss the new approach to Zero Trust and what organizations need to consider to successfully implement a Zero Trust Architecture today.

Although Zero Trust is not new as a concept (I believe it was introduced sometime in 2010), most conversations in the past have been around securing the perimeter and were led largely by our networking and IAM teams. And this is to be expected – our data was mainly located in the corporate data centers and accessed from the corporate network, with implicit trust for everyone inside and protection aimed mostly at external threats. However, our IT landscape has changed significantly – data and applications are spread between “on premises” and multiple cloud providers and are accessed based on the principle of “anywhere – anytime”. We also worry about both internal and external threat actors, and trust is never given by default; it is always verified – not only once, but continuously. For these reasons, the new approach to ZTA today is to invite new areas of expertise – people like me, who bring into the conversation the new perspectives of application/workflow and inter-system trust, knowledge of the data security life cycle, machine identity, etc. There is also a strong connection between ZTA and DevSecOps – both are seen as accelerators for the business, and both bring a new mindset, building a new culture across the enterprise.

 

Q: You are also leading a Data Protection program. What are some of the top priorities of a modern Data Protection program? What role does DevSecOps play in it? Any secrets you can share?

A: With about 4,000 confirmed data breaches in 2020, of which close to 60% involved compromised PII, data protection today is a critical segment of enterprise cybersecurity readiness. (I am citing from memory; the data is from the Verizon 2020 Data Breach Investigations Report.) Of course, the ultimate priority of any Data Protection program is to ensure the successful growth of the business based on secure and compliant data practices. A lot of emphasis, and even enforcement, today is also put on the data privacy function, explicitly granting consumers the rights to their data. Establishing a good Data Protection program means building many capabilities across the areas of data security, data privacy, and data governance, addressing audit/legal/regulatory concerns, driving expense reduction, etc. Lately, though, I catch myself continuously explaining the importance of data classification because, in my mind, data classification rules. Let me explain. How would we know what network segmentation or access controls are required for a newly implemented data flow, or what safeguards and controls need to be put in place (e.g., encryption/tokenization for PCI data)? The only way to gain this knowledge is to classify the data according to the company’s policies, understand the type of the data elements, and then – design appropriately.
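
To show why classification “rules”, here is a tiny sketch of deriving required safeguards directly from classification labels. The labels and the control mappings are illustrative assumptions, not any specific policy framework – the point is simply that once a data element is classified, the appropriate controls can be selected automatically.

```python
# Sketch: derive required controls for each data element from its classification.
# The classification labels and control mappings are hypothetical examples.
CONTROLS_BY_CLASSIFICATION = {
    "public":       [],
    "internal":     ["access_logging"],
    "confidential": ["access_logging", "encryption_at_rest", "restricted_network_segment"],
    "pci":          ["access_logging", "encryption_at_rest", "tokenization", "restricted_network_segment"],
}


def required_controls(data_elements: dict[str, str]) -> dict[str, list[str]]:
    """Map each data element to the controls its classification demands."""
    return {
        element: CONTROLS_BY_CLASSIFICATION.get(classification, ["review_manually"])
        for element, classification in data_elements.items()
    }


if __name__ == "__main__":
    # Hypothetical elements in a newly implemented data flow.
    new_data_flow = {"policy_number": "internal", "card_number": "pci", "claim_notes": "confidential"}
    for element, controls in required_controls(new_data_flow).items():
        print(f"{element}: {', '.join(controls) or 'no additional controls'}")
```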

DevSecOps plays a very important role in a Data Protection implementation. Not only do the data scientists benefit from the consolidated CI/CD process, but, if properly applied, the DevSecOps practices bring automation, control, repeatability, and auditability; they ensure velocity, prevent misconfigurations, enforce permissions and data retention policies, etc. These are all very important capabilities for Data Loss Protection/Prevention (DLP).
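
As one example of a DLP-adjacent policy that DevSecOps automation can enforce, the sketch below flags data sets that have outlived their retention period. The retention periods and data set records are hypothetical examples used only to show the shape of such a check.

```python
# Sketch: flag data sets held longer than their (assumed) retention period allows.
from datetime import date, timedelta

RETENTION_DAYS = {"internal": 365 * 3, "confidential": 365 * 7, "pci": 365}  # assumed policy

DATA_SETS = [
    {"name": "claims-archive-2015", "classification": "confidential", "created": date(2015, 4, 1)},
    {"name": "card-auth-logs",      "classification": "pci",          "created": date(2024, 1, 15)},
]


def overdue(as_of: date) -> list[str]:
    """Return the names of data sets held longer than their retention period."""
    names = []
    for data_set in DATA_SETS:
        limit = timedelta(days=RETENTION_DAYS[data_set["classification"]])
        if as_of - data_set["created"] > limit:
            names.append(data_set["name"])
    return names


if __name__ == "__main__":
    for name in overdue(date.today()):
        print(f"Retention exceeded: {name}")
```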

To answer your last question – unfortunately, there are no secrets here. Data Protection, and more specifically DLP, requires a great understanding of the business and its data, quite an expensive set of tools, and effective processes applied across the entire company. Although not easy, this is entirely possible when we keep the doors open for wide, cross-company collaboration.

 

Q: I know you are writing a new book on cloud – can you give us a preview? What is the book about, and what is the specific “cloud” point of view you selected to write about?

A: Now that the adoption of cloud is consistent and no longer just hype, there have been many great publications across the wide spectrum of cloud topics – from the financial and business benefits of cloud adoption to best practices and particular technical challenges like managing cloud resources on multi-cloud platforms, migrating to the cloud, adopting cloud native technologies, the specifics of cloud security – you name it! There are so many important topics to write about! However, after talking to many of my colleagues and cloud practitioners, I selected a bit of an unusual point of view. The book is intended to explore the value and the potential of cloud adoption as part of the rapid digital transformation of the 21st century in our post-pandemic world. We try to answer questions like: “How can cloud adoption fuel business growth and support the agility every company (big and small) needs in order to compete effectively today?” and “What is the impact of the pandemic on cloud adoption and all the technologies fueled by cloud?”, etc. It is a practical book that discusses some of the most common questions posed by IT leaders and cloud practitioners today and provides a point of view backed by use cases to support the new, post-pandemic cloud strategies.

 

Q: You enjoy presenting at conferences, and it seems that people like your presentation style. Can you give us a few tips on creating a technical presentation? 

A: Oh, I am not sure I am an expert here, but here we go! Usually, I like to keep the presentation entertaining and interactive – asking the audience questions, giving a lot of examples, building on a story or two, and using meaningful (sometimes provocative) images. Remember – “a picture is worth a thousand words” – there is a good reason why this expression originated in the early 20th century and is still very popular. A good idea is to keep the slides uncluttered, although I tend to have some “busy” textual slides… (guilty here!) Also, I like to have a clear outline with a crisp introduction and a summary or lessons learned. I always try to leave the attendees with some kind of follow-up – like “next steps” or a “conversation starter” – something to inspire people to keep thinking about the topic and bring the conversation back to their companies and colleagues.

 

Maria Schwenger – AVP – Enterprise Digital Risk – Head of Application Security and Data Protection at American Family Insurance

Maria is an innovative DevSecOps and Data Protection leader, well known for leading multiple successful implementations of the vision of modern DevSecOps and for her leadership in executing digital transformation in areas like IoT/Edge, AI, and Big Data Analytics.

She specializes in leading organizations to effectively adopt and utilize new cloud technologies and new architecture paradigms like APIs/microservices, containerization, orchestration, serverless, etc., and in applying DevSecOps and Agile best practices of secure Continuous Integration and Continuous Delivery. The results of her work on building a “5 Star DevSecOps experience” demonstrate manifold efficiency and productivity gains in the development process, leading to fast and secure product improvements.

Currently, Maria is concentrating on creating a comprehensive but simple-to-implement DevSecOps practice that can be easily adopted across the board, based on the best practices of secure engineering and data protection.

 

 
