Cloud Native Driving Change in Enterprise and Analytics

A pair of keynote talks at the DeveloperWeek Global conference, held online this week, hashed out the growing trend of enterprises going cloud native and how cloud native can affect the future of business analytics. Dan McKinney, senior engineer and developer relations lead at Cloudsmith, focused on how cloud native supports continuous software pipelines in enterprises. Roman Stanek, CEO of GoodData, spoke on the influence cloud native can have on the analytics space. Their keynotes highlighted how software development in the cloud is creating new dynamics within organizations.

In his keynote, Stanek spoke about how cloud native could transform analytics and business intelligence. He described how developers might take ownership of business intelligence, looking at how data is exposed, workflows, and platforms. “Most people are just overloaded with PDF files and Excel files and it’s up to them to visualize and interpret the data,” Stanek said.

Data embedded into workflows and tools such as Slack is being democratized, he said, but exposing data from applications, or integrating it natively into them, remains the province of developers. Tools exist, Stanek said, for developers to make such data analytics more accessible and understandable to users. “We want to help people make decisions,” he said. “We also want to get them data at the right time, with the right context and volume.”

Stanek said he sees more developers owning business applications, insights, and intelligence up to the point where end users can make decisions. “This industry is heading away from an isolated industry where business people are copying data into visualization tools and data preparation tools and analytics tools,” he said. “We are moving into a world where we will be providing all of this functionality as a headless functionality.” The rise of headless services, which expose their functionality through network APIs rather than through a user interface of their own, may lead to new composition tools that allow business users to build their own applications with low-code/no-code resources, Stanek said.
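As a rough illustration of the headless pattern Stanek describes, an embedding application might pull computed insights from an analytics service over a plain API and render them wherever the workflow lives. A minimal Python sketch follows; the endpoint, token, and response shape are hypothetical stand-ins, not GoodData’s actual API:

```python
import requests

# Hypothetical headless-analytics endpoint and token; illustrative only.
API_BASE = "https://analytics.example.com/api"
TOKEN = "YOUR_API_TOKEN"

def fetch_insight(insight_id: str) -> dict:
    """Fetch a computed insight as JSON rather than a rendered dashboard."""
    resp = requests.get(
        f"{API_BASE}/insights/{insight_id}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# The embedding application decides how to render the numbers:
# a chart in a web app, a Slack message, or a plain table.
for row in fetch_insight("weekly-revenue").get("rows", []):
    print(row)
```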

Enterprise understanding of what constitutes cloud is evolving as well. Though cloud native and cloud hosted sound similar, McKinney said they can be different resources. “The cloud goes way beyond just storing and hosting,” he said. “It is at the heart of a whole new range of technical possibilities.” Many enterprises are moving from on-prem and cloud-hosted solutions to completely cloud-native solutions for continuous software, McKinney said, as cloud providers expand their offerings. “It is opening up new ways to build and deploy applications.”

The first wave of applications migrated to the cloud were cloud hosted, he said. “At a very high level, a cloud-hosted application has been lifted and shifted onto cloud-based server instances.” That gave them access to basic features from cloud providers and offered some advantages over on-prem applications, McKinney said. Still, the underlying architecture of the applications remained largely the same. “Legacy applications migrated to the cloud were never built to take advantage of the paradigm shift that cloud providers present,” he said. Such applications cannot take advantage of shared services or pools of resources and are not suited to scaling. “It doesn’t have the elasticity,” McKinney said.

The march toward the cloud has since accelerated, he said, with the next wave of applications built natively to take advantage of it. Applications born and deployed with the technology of cloud providers in mind typically make use of continuous integration, orchestrators, container engines, and microservices, McKinney said. “Cloud-native applications are increasingly architected as smaller and smaller pieces and they share and reuse services wherever possible.”
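To make the “smaller pieces” idea concrete, here is a minimal sketch of a single-purpose microservice; Flask and the endpoints shown are illustrative choices, not anything McKinney named:

```python
# One microservice that does exactly one job (pricing) and exposes it over
# HTTP. Flask is used only as a convenient example framework.
from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/healthz")
def health():
    # Orchestrators probe endpoints like this one to decide whether to
    # restart or replace a container.
    return jsonify(status="ok")

@app.get("/prices/<sku>")
def price(sku: str):
    # In a real system this service would own only pricing data and call
    # other services for everything else. The price here is a placeholder.
    return jsonify(sku=sku, price_usd=9.99)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```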

Enterprises favor cloud-native solutions now for such reasons as the total cost of ownership, performance and security of the solution, and accommodating distributed teams, McKinney said. There is a desire, he said, to shift from capital expense on infrastructure to operational expense on running costs. These days the costs of cloud-native applications can be calculated fairly easily, McKinney said. Cloud-native resources offer fully managed service models, which can maintain the application itself. “You don’t have to think about what version of the application you have deployed,” he said. “It’s all part of the subscription.”

The ability to scale up with the cloud to meet increased demand was one of the first drivers of migration, McKinney said, but cloud-native applications can go beyond simple scaling. “Cloud-native applications can scale down to the level of individual functions,” he said. “It’s more responsive, efficient, and able to better suit increasing demands — particularly spike loads.”
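As a sketch of what function-level scaling looks like in practice, the snippet below uses the AWS Lambda Python handler convention as one example; the event shape and the order-total logic are illustrative assumptions:

```python
import json

# A function-as-a-service unit (AWS Lambda's Python handler convention).
# The platform runs as many concurrent copies of this one function as
# traffic demands, including sudden spikes, and scales back to zero when
# traffic subsides. The event payload shape below is assumed.
def handler(event, context):
    order = json.loads(event.get("body", "{}"))
    total = sum(item["qty"] * item["unit_price"] for item in order.get("items", []))
    return {"statusCode": 200, "body": json.dumps({"order_total": total})}
```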

 

Why You Need a Data Fabric, Not Just IT Architecture

Data fabrics offer an opportunity to track, monitor and utilize data, while IT architectures track, monitor and maintain IT assets. Both are needed for a long-term digitalization strategy.

As companies move into hybrid computing, they’re redefining their IT architectures. IT architecture describes a company’s entire IT asset base, whether on-premises or in-cloud. This architecture is stratified into three basic levels: hardware such as mainframes, servers, etc.; middleware, which encompasses operating systems, transaction processing engines, and other system software utilities; and the user-facing applications and services that this underlying infrastructure supports.

IT architecture has been a recent IT focus because as organizations move to the cloud, IT assets also move, and there is a need to track and monitor these shifts.

However, with the growth of digitalization and analytics, there is also a need to track, monitor, and maximize the use of data that can come from a myriad of sources. An IT architecture can’t provide data management, but a data fabric can. Unfortunately, most organizations lack well-defined data fabrics, and many are still trying to understand why they need a data fabric at all.

What Is a Data Fabric?

Gartner defines a data fabric as “a design concept that serves as an integrated layer (fabric) of data and connecting processes. A data fabric utilizes continuous analytics over existing, discoverable and inferenced metadata assets to support the design, deployment and utilization of integrated and reusable data across all environments, including hybrid and multi-cloud platforms.”

Let’s break it down.

Every organization wants to use data analytics for business advantage. To use analytics well, you need data agility that enables you to easily connect and combine data from any source your company uses, whether the source is an enterprise legacy database or data culled from social media or the Internet of Things (IoT). You can’t achieve data integration and connectivity without using data integration tools, and you also must find a way to connect and relate disparate data to each other in meaningful ways if your analytics are going to work.

This is where the data fabric enters. The data fabric contains all the connections and relationships between an organization’s data, no matter what type of data it is or where it comes from. The goal of the fabric is to function as an overall tapestry that interweaves all data so that data in its entirety is searchable. This has the potential not only to optimize data value, but to create a data environment that can answer virtually any analytics query. The data fabric does what an IT architecture can’t: it tells you what data does, and how different data elements relate to one another. Without a data fabric, companies’ ability to leverage data and analytics is limited.

Building a Data Fabric

When you build a data fabric, it’s best to start small and in a place where your staff already has familiarity.

That “place” for most companies will be with the tools that they are already using to extract, transform and load (ETL) data from one source to another, along with any other data integration software such as standard and custom APIs. All of these are examples of data integration you have already achieved.
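For illustration, a stripped-down version of the kind of ETL job most teams already run might look like the sketch below; SQLite stands in for whatever legacy and analytics databases a company actually uses, and the table names are invented:

```python
import sqlite3
import pandas as pd

def etl():
    # Extract: pull raw orders from a legacy source (illustrative database).
    src = sqlite3.connect("legacy_erp.db")
    orders = pd.read_sql("SELECT order_id, customer_id, amount, ts FROM orders", src)

    # Transform: normalize types and derive a reporting column.
    orders["ts"] = pd.to_datetime(orders["ts"])
    orders["month"] = orders["ts"].dt.to_period("M").astype(str)

    # Load: write into the analytics store the fabric will build on.
    dst = sqlite3.connect("analytics.db")
    orders.to_sql("fact_orders", dst, if_exists="replace", index=False)

etl()
```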

Now, you want to add more data to your core. You can do this by continuing to use the ETL and other data integration methods you already have in place as you build out your data fabric. In the process, care should be taken to also add the metadata about your data, which will include the origin point of the data, how it was created, what business and operational processes use it, what its form is (e.g., a single field in a fixed record, or an entire image file), etc. By maintaining the data’s history, as well as all its transformations, you are in a better position to check data for reliability and to ensure that it is secure.
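One simple way to picture that metadata envelope is as a record kept alongside each dataset. The field names below are illustrative, not a standard schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DatasetMetadata:
    name: str
    origin: str                      # where the data was created
    created_at: datetime             # when it entered the fabric
    form: str                        # e.g. "fixed-record field" or "image file"
    producing_process: str           # business/operational process that creates it
    consuming_processes: list[str] = field(default_factory=list)
    transformations: list[str] = field(default_factory=list)  # lineage history

meta = DatasetMetadata(
    name="fact_orders",
    origin="legacy_erp.db/orders",
    created_at=datetime.utcnow(),
    form="relational table",
    producing_process="order entry",
    consuming_processes=["monthly revenue report"],
    transformations=["normalized timestamps", "derived month column"],
)
```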

As your data fabric grows, you will probably add data tools that are missing from your workbench. These might be tools that help with tracking data, sharing metadata, applying governance to data, etc. A recommendation in this area is to look for all-inclusive data management software that contains not only all the tools you’ll need to build a data fabric, but also important automation such as built-in machine learning.

The machine learning observes how data in your data fabric is working together, and which combinations of data are used most often in different business and operational contexts. When you query the data, the ML assists in pulling together the data that is most likely to answer your queries.
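A deliberately simplified stand-in for that behavior: log which datasets are queried together, then suggest the most frequent companions when a new query touches one of them. Real products use far richer models; this sketch only illustrates the idea:

```python
from collections import Counter
from itertools import combinations

co_use = Counter()  # how often each pair of datasets appears in one query

def record_query(datasets: list[str]) -> None:
    for pair in combinations(sorted(datasets), 2):
        co_use[pair] += 1

def suggest(dataset: str, top: int = 3) -> list[str]:
    # Rank the datasets most often used alongside the given one.
    scores = Counter()
    for (a, b), n in co_use.items():
        if a == dataset:
            scores[b] += n
        elif b == dataset:
            scores[a] += n
    return [name for name, _ in scores.most_common(top)]

record_query(["fact_orders", "dim_customer", "dim_region"])
record_query(["fact_orders", "dim_customer"])
print(suggest("fact_orders"))  # ['dim_customer', 'dim_region']
```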

 

5 minutes with Vishal Jain – Navigating cybersecurity in a hybrid work environment

Are you ready for hybrid work? Though the hybrid office will create great opportunities for employees and employers alike, it will create some cybersecurity challenges for security and IT operations. Here, Vishal Jain, Co-Founder and CTO at Valtix, a Santa Clara, Calif.-based provider of cloud native network security services, speaks to Security magazine about the many ways to develop a sustainable cybersecurity program for the new hybrid workforce.

Security: What is your background and current role? 

Jain: I am the co-founder and CTO of Valtix. My background is primarily building products and technology at the intersection of networking, security and cloud; I built Content Delivery Networks (CDNs) during the early days of Akamai and just finished doing Software Defined Networking (SDN) at a startup that built ACI for Cisco.

 

Security: There’s a consensus that for many of us, the reality will be a hybrid workplace. What does the hybrid workforce mean for cybersecurity teams?

Jain: The pandemic has accelerated trends that had already begun before 2019. We’ve just hit an inflection point on the rate of change – taking on much more change in a much shorter period of time. The pandemic is an inflection point for cloud tech adoption. I think about this in three intersections of work, apps, infrastructure, and security:

  1. Work and Apps: A major portion of the workforce will continue to work remotely, communicating using collaboration tools like Zoom, WebEx, etc. Post-pandemic, video meetings will be the new norm, compared to the old model where in-person meetings were the norm. The defaults have changed. Similarly, the expectation now is that any app is accessible anywhere from any device.
  2. Apps and Infrastructure: The default is cloud. This also means that the expectation for infrastructure now centers on speed, agility, elasticity, effectively infinite capacity, and delivery as a service.
  3. Infrastructure and Security: This is very important for cybersecurity teams: how do they take a discipline like security from a static environment (the traditional enterprise) and apply it to a dynamic environment like the cloud?

Security: What solutions will be necessary for enterprise security to implement as we move towards this new work environment?

Jain: In this new work environment where any app is accessible anywhere from any device, enterprise security needs to focus on security of users accessing those apps and security of those apps themselves. User-side security and securing access to the cloud is a well-understood problem now, plenty of innovation and investments have been made here. For security of apps, we need to look back at intersections 2 and 3, mentioned previously.

Enterprises need to understand security disciplines, but implementing them is very different in this new work environment. Security solutions need to evolve to address both security and ops challenges. On the security side, the definition of visibility has to expand. On the operational side of security, solutions need to be cloud-native, elastic, and infinitely scalable so that enterprises can focus on applications, not the infrastructure.

Security: What are some of the challenges that will need to be overcome as part of a hybrid workplace?

Jain: Engineering teams typically have experience working across distributed teams, so engineering and the product side of things are not super challenging in a hybrid workplace. On the other hand, selling becomes very different; getting both customers and the sales team used to this different world is a challenge enterprises need to focus on. Habits and culture are always the hardest part to change. This is true in security too. There is a tendency to bring in old solutions to secure this new world. Security practitioners may try to bring in the same tech and products they have been using for 10 years, but deep down they know it’s a bad fit.

 

What Will Be the Next New Normal in Cloud Software Security?

Accelerated moves to the cloud made sense at the height of the pandemic — organizations may face different concerns in the future.

Organizations that accelerated their adoption of cloud native apps, SaaS, and other cloud-driven resources to cope with the pandemic may have to weigh other security matters as potential “new normal” operations take shape. Though many enterprises continue to make the most of remote operations, hybrid workplaces might be on the horizon for some. Experts from cybersecurity company Snyk and SaaS management platform BetterCloud see new scenarios in security emerging for cloud resources in a post-pandemic world.

The swift move to remote operations and work-from-home situations naturally led to fresh concerns about endpoint and network security, says Guy Podjarny, CEO and co-founder of Snyk. His company recently issued a report on the State of Cloud Native Application Security, exploring how cloud-native adoption affects defenses against threats. As more operations moved remote and to the cloud, security teams had to distinguish between authorized personnel who needed access from outside the office and actual threats from bad actors.

Decentralization was already underway at many enterprises before COVID-19, though that trend may have been further catalyzed by the response to the pandemic. “Organizations are becoming more agile and the thinking that you can know everything that’s going on hasn’t been true for a long while,” Podjarny says. “The pandemic has forced us to look in the mirror and see that we don’t have line of sight into everything that’s going on.”

This led to distribution of security controls, he says, to allow for more autonomous usage by independent teams who are governed in an asynchronous manner. “That means investing more in security training and education,” Podjarny says.

A need for a security-based version of digital transformation surfaced, he says, with more automated tools that work at scale, offering insight on distributed activities. Podjarny says he expects most security needs that emerged amid the pandemic will remain after businesses can reopen more fully. “The return to the office will be partial,” he says, expecting some members of teams to not be onsite. This may be for personal, work-life reasons, or because organizations want to take advantage of smaller office footprints, Podjarny says.

That could lead to some issues, however, with the governance of decentralized activities and related security controls. “People don’t feel they have the tools to understand what’s going on,” he says. The net changes that organizations continue to make in response to the pandemic, and what may come after, have been largely positive, Podjarny says. “It moves us towards security models that scale better and are adapted to the SaaS, remote working reality.”

The rush to cloud-based applications such as SaaS and platform-as-a-service at the onset of the pandemic brought recognition that organizations had to offer ways to maintain operations under quarantine guidelines. “Employees were just trying to get the job done,” says Jim Brennan, chief product officer with BetterCloud. Spinning up such technologies, he says, enabled staff to meet those goals. That compares with the past, when such “shadow IT” actions might have been regarded as a threat to the business. “We heard from a lot of CIOs where it really changed their thinking,” Brennan says, which led to efforts to facilitate the availability of such resources to support employees.

Meeting those needs at scale, however, created a new challenge. “How do I successfully onboard a new application for 100 employees? One thousand employees? How do I do that for 50 new applications? One hundred new applications?” Brennan says many CIOs and chief security officers have sought greater visibility into the cloud applications that have been spun up within their organizations and how they are being used. BetterCloud produced a brief recently on the State of SaaS, which looks at SaaS file security exposure.

Automation is being put to work, Brennan says, to improve visibility into those applications. This is part of the emerging landscape that even sees some organizations decide that the concept of shadow IT — the use of technology without direct approval — is a misnomer. “A CIO told me they don’t believe in ‘shadow IT,’” he says. In effect, the CIO regarded all IT, authorized or not, as a means to get work done.

 

Meet Leanne Hurley: Cloud Expert of the Month – April 2021

Cloud Girls is honored to have amazingly accomplished, professional women in tech as our members. We take every opportunity to showcase their expertise and accomplishments – promotions, speaking engagements, publications, and more. Now, we are excited to shine a spotlight on one of our members each month.

Our Cloud Expert of the Month is Leanne Hurley.

After starting out at the front counter of a two-way radio shop in 1993, Leanne worked her way from face-to-face customer service, to billing, to training and finally into sales. She has been in sales since 1996 and has (mostly!) loved every minute of it. Leanne started selling IaaS (whether co-lo, managed hosting or cloud) during the dot-com boom and has expanded her expertise during her time at SAP. Now, she enjoys leading a team of sales professionals as she works with companies to improve business outcomes and accelerate digital transformation utilizing SAP’s Intelligent Enterprise.

When did you join Cloud Girls and why?

I was one of the first members of Cloud Girls in 2011. I joined because having a strong network and community of women in technology is important.

What do you value about being a Cloud Girl?  

I value the relationships and women in the group.

What advice would you give to your younger self at the start of your career?

Stop doubting yourself. Continue to ask questions and don’t be intimidated by people that try to squash your tenacity and curiosity.

What’s your favorite inspirational quote?

“You can have everything in life you want if you will just help other people get what they want.”  – Zig Ziglar

What one piece of advice would you share with young women to encourage them to take a seat at the table?

Never stop learning and always ask questions. In technology women (and men too for that matter) avoid asking questions because they think it reveals some sort of inadequacy. That is absolutely false. Use your curiosity and thirst for knowledge as a tool, it will serve you well all your life.

You’re a new addition to the crayon box. What color would you be and why?

I would be Sassy-molassy because I’m a bit sassy.

What was the best book you read this year and why?

I loved American Dirt because it humanized the US migrant plight and reminded me how blessed and lucky we all are to have been born in the US.

What’s the most useless talent you have? Why?

 

Protecting Remote Workers Against the Perils of Public Wi-Fi

In a physical office, front-desk security keeps strangers out of work spaces. In your own home, you control who walks through your door. But what happens when your “office” is a table at the local coffee shop, where you’re sipping a latte among total strangers?

Widespread remote work is likely here to stay, even after the pandemic is over. But the resumption of travel and the reopening of public spaces raises new concerns about how to keep remote work secure.

In particular, many employees used to working in the relative safety of an office or private home may be unaware of the risks associated with public Wi-Fi. Just like you can’t be sure who’s sitting next to your employee in a coffee shop or other public space, you can’t be sure whether the public Wi-Fi network they’re connecting to is safe. And the second your employee accidentally connects to a malicious hotspot, they could expose all the sensitive data that’s transmitted in their communications or stored on their device.

Taking scenarios like this into account when planning your cybersecurity protections will help keep your company’s data safe, no matter where employees choose to open their laptops.

The risks of Wi-Fi search

An employee leaving Wi-Fi enabled when they leave the house may seem harmless, but it leaves them incredibly vulnerable. Wi-Fi-enabled devices can reveal the network names (SSIDs) they normally connect to when they are on the move. An attacker can then use this information to emulate a known “trusted” network that is not encrypted and pretend to be that network. Many devices will automatically connect to these “trusted” open networks without verifying that the network is legitimate.

Often, attackers don’t even need to emulate known networks to entice users to connect. According to a recent poll, two-thirds of people who use public Wi-Fi set their devices to connect automatically to nearby networks, without vetting which ones they’re joining.

If your employee automatically connects to a malicious network — or is tricked into doing so — a cybercriminal can unleash a number of damaging attacks. The network connection can enable the attacker to intercept and modify any unencrypted content that is sent to the employee’s device. That means they can insert malicious payloads into innocuous web pages or other content, enabling them to exploit any software vulnerabilities that may be present on the device.

Once such malicious content is running on a device, many technical attacks are possible against other, more important parts of the device software and operating system. Some of these provide administrative or root level access, which gives the attacker near total control of the device. Once an attacker has this level of access, all data, access, and functionality on the device is potentially compromised. The attacker can remove or alter the data, or encrypt it with ransomware and demand payment in exchange for the key.

The attacker could even use the data to emulate and impersonate the employee who owns and/or uses the device. This sort of fraud can have devastating consequences for companies. Last year, a Florida teenager was able to take over multiple high-profile Twitter accounts by impersonating a member of the Twitter IT team.

A multi-layered approach to remote work security

These worst-case scenarios won’t occur every time an employee connects to an unknown network while working remotely outside the home — but it only takes one malicious network connection to create a major security incident. To protect against these problems, make sure you have more than one line of cybersecurity defenses protecting your remote workers against this particular attack vector.

Require VPN use. The best practice for users who need access to non-corporate Wi-Fi is to require that all web traffic on corporate devices go through a trusted VPN. This greatly limits the attack surface of a device, and reduces the probability of a device compromise if it connects to a malicious access point.
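As one hedged sketch of enforcing that rule, a corporate device could check that its public egress address falls inside the VPN provider’s known range before allowing work traffic; the IP echo service called below is a real public endpoint, but the CIDR block and the blocking policy are placeholders:

```python
import ipaddress
import sys
import requests

# Placeholder egress range; 203.0.113.0/24 is a documentation-only block
# (TEST-NET-3) standing in for the VPN provider's real address space.
VPN_EGRESS = ipaddress.ip_network("203.0.113.0/24")

def on_vpn() -> bool:
    # Ask a public echo service what address our traffic appears to come from.
    ip = requests.get("https://api.ipify.org", timeout=5).text.strip()
    return ipaddress.ip_address(ip) in VPN_EGRESS

if __name__ == "__main__":
    if not on_vpn():
        print("Not connected through the corporate VPN; blocking work traffic.")
        sys.exit(1)
    print("VPN confirmed; proceeding.")
```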

Educate employees about risk. Connecting freely to public Wi-Fi is normalized in everyday life, and most people have no idea how risky it is. Simply informing your employees about the risks can have a major impact on behavior. No one wants to be the one responsible for a data breach or hack.

 

 

Meet Andrea Blubaugh: Cloud Expert of the Month – February 2021

Cloud Girls is honored to have amazingly accomplished, professional women in tech as our members. We take every opportunity to showcase their expertise and accomplishments – promotions, speaking engagements, publications and more. Now, we are excited to shine a spotlight on one of our members each month.

Our Cloud Expert of the Month is Andrea Blubaugh.

Andrea has more than 15 years of experience facilitating the design, implementation and ongoing management of data center, cloud and WAN solutions. Her reputation for architecting solutions for organizations of all sizes and verticals – from Fortune 100 to SMBs – earned her numerous awards and honors. With a specific focus on the mid to enterprise space, Andrea works closely with IT teams as a true client advocate, consistently meeting, and often exceeding expectations. As a result, she maintains strong client and provider relationships spanning the length of her career.

When did you join Cloud Girls and why?  

Wow, it’s been a long time! I believe it was 2014 or 2015 when I joined Cloud Girls. I had come to know Manon through work and was impressed by her and excited to join a group of women in the technology space.

What do you value about being a Cloud Girl?  

Getting to know and develop friendships with the fellow Cloud Girls over the years has been a real joy. It’s been a great platform for learning on both a professional and personal level.

What advice would you give to your younger self at the start of your career?  

I would reassure my younger self in her decisions and encourage her to keep taking risks. I would also tell her not to sweat the losses so much. They tend to fade pretty quickly.

What’s your favorite inspirational quote?  

“Twenty years from now you will be more disappointed by the things that you didn’t do than by the ones you did do, so throw off the bowlines, sail away from safe harbor, catch the trade winds in your sails. Explore, Dream, Discover.”  –Mark Twain

What one piece of advice would you share with young women to encourage them to take a seat at the table?  

I was very fortunate early on in my career to work for a startup whose leadership saw promise in my abilities that I didn’t yet see myself. I struggled with the decision to take a leadership role as I didn’t feel “ready” or that I had the right or enough experience. I received some good advice that I had to do what ultimately felt right to me, but that turning down an opportunity based on a fear of failure wouldn’t ensure there would be another one when I felt the time was right. My advice is if you’re offered that seat, and you want that seat, take it.

What’s one item on your bucket list and why?

 

 

How Object Storage Is Taking Storage Virtualization to the Next Level

We live in an increasingly virtual world. Because of that, many organizations not only virtualize their servers, they also explore the benefits of virtualized storage.

Gaining popularity 10-15 years ago, storage virtualization is the process of sharing storage resources by bringing physical storage from different devices together in a centralized pool of available storage capacity. The strategy is designed to help organizations improve agility and performance while reducing hardware and resource costs. However, this effort, at least to date, has not been as seamless or effective as server virtualization.

That is starting to change with the rise of object storage – an increasingly popular approach that manages data storage by arranging it into discrete and unique units, called objects. These objects are managed within a single pool of storage instead of a legacy LUN/volume block store structure. The objects are also bundled with associated metadata to form a centralized storage pool.

Object storage truly takes storage virtualization to the next level. I like to call it storage virtualization 2.0 because it makes it easier to deploy increased storage capacity through inline deduplication, compression, and encryption. It also enables enterprises to effortlessly reallocate storage where needed while eliminating the layers of management complexity inherent in storage virtualization. As a result, administrators do not need to worry about allocating a given capacity to a given server, because all servers have equal access to the object storage pool.
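A brief sketch of that object model, using the S3-compatible API that most object stores expose (boto3 here); the endpoint, bucket, and metadata keys are illustrative:

```python
import boto3

# Point an S3-compatible client at an object store; the endpoint is assumed.
s3 = boto3.client("s3", endpoint_url="https://objects.example.com")

# Each object is written into one flat pool along with its own metadata;
# there are no LUNs or volumes to pre-allocate per server.
with open("inv-0042.pdf", "rb") as f:
    s3.put_object(
        Bucket="shared-pool",
        Key="invoices/2021/inv-0042.pdf",
        Body=f,
        Metadata={"origin": "billing-app", "retention-class": "7y"},
    )

# Any server with credentials reads from the same pool; capacity is never
# carved up per machine.
obj = s3.get_object(Bucket="shared-pool", Key="invoices/2021/inv-0042.pdf")
print(obj["Metadata"])
```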

One key benefit is that organizations no longer need a crystal ball to predict their utilization requirements. Instead, they can add the exact amount of storage they need, anytime and in any granularity, to meet their storage requirements. And they can continue to grow their storage pool with zero disruption and no application downtime.

Greater security

Perhaps the most significant benefit of storage virtualization 2.0 is that it can do a much better job of protecting and securing your data than legacy iterations of storage virtualization.

Yes, with legacy storage solutions, you can take snapshots of your data. But the problem is that these snapshots are not immutable. And that fact should have you concerned. Why? Because, although you may have a snapshot, when data changes or is overwritten there is no way to recapture the original.

So, once you do any kind of update, you have no way to return to the original data. Quite simply, you are losing the old data snapshots in favor of the new. While there are some exceptions, this is the case with the majority of legacy storage solutions.

With object storage, however, your data snapshots are indeed immutable. Because of that, organizations can now capture and back up their data in near real-time—and do it cost-effectively. An immutable storage snapshot protects your information continuously by taking snapshots every 90 seconds so that even in the case of data loss or a cyber breach, you will always have a backup. All your data will be protected.
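A sketch of what immutability can look like through the same S3-compatible API, using object lock with a compliance-mode retention window; the bucket (which must be created with object lock enabled), the retention period, and the change-capture helper are assumptions, with the 90-second cadence taken from the article:

```python
import time
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3")

def read_changed_blocks() -> bytes:
    # Stand-in for the product's change-capture mechanism.
    return b"...changed data since the last snapshot..."

def take_snapshot(seq: int) -> None:
    s3.put_object(
        Bucket="immutable-snaps",     # assumed bucket created with Object Lock enabled
        Key=f"snaps/{seq:010d}",
        Body=read_changed_blocks(),
        ObjectLockMode="COMPLIANCE",  # cannot be deleted or altered until retention expires
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=30),
    )

seq = 0
while True:
    take_snapshot(seq)
    seq += 1
    time.sleep(90)  # the article's 90-second snapshot cadence
```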

Taming the data deluge

Storage virtualization 2.0 is also more effective than the original storage virtualization when it comes to taming the data tsunami. Specifically, it can help manage the massive volumes of data—such as digital content, connected services, and cloud-based apps—that companies must now deal with. Most of this new content and data is unstructured, and organizations are discovering that their traditional storage solutions are not up to managing it all.

It’s a real problem. Unstructured data eats up a vast amount of a typical organization’s storage capacity. IDC estimates that 80% of data will be unstructured in five years. For the most part, this data takes up primary, tier-one storage on virtual machines, which can be a very costly proposition.

It doesn’t have to be this way. Organizations can offload much of this unstructured data via storage virtualization 2.0, with immutable snapshots and centralized pooling capabilities.

The net effect is that by moving unstructured data to object storage, organizations won’t have it stored on VMs and won’t need to back it up in the traditional sense. With object storage taking immutable snaps and replicating them to another offsite cluster, organizations can eliminate 80% of their backup requirements and backup window.

This dramatically lowers costs, because instead of having 80% of storage in primary, tier-one environments, everything is now stored and protected on object storage.

All of this also dramatically reduces the recovery time of unstructured data, from days and weeks to less than a minute, regardless of whether it’s terabytes or petabytes of data. And because the network no longer moves the data around from point to point, it’s much less congested. What’s more, the probability of failed data backups goes away, because there are no more backups in the traditional sense.

The need for a new approach

As storage needs increase, organizations need more than just virtualization.

 

Is Your Pandemic-Fueled Cloud Migration Sustainable?

COVID-19 shoved enterprises into the cloud. While remote work is sustainable, emergency cloud strategies are not.

Enterprises were already moving deeper into the cloud before the pandemic hit. Multi-year plans were replaced by emergency implementations to facilitate remote work and digital customer interactions. Businesses and their IT departments have been proud of their heroic efforts, but emergency implementations are not sustainable over the long term.

“Regardless of what we did right or wrong, there was a rationalization behind it,” said George Burns, senior consultant for cloud operations at digital transformation agency SPR. “Now we need to take a step back and look at projects through a different prism.”

Governance

Data governance is non-optional for companies whether they’re regulated or not, especially with data regulations such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). Burns said some of his clients are having trouble finding data now that they’ve shoveled it into the cloud.

“We need to rearchitect some of these solutions we’ve put in place, but then we need to come up with implementation plans that are even less disruptive than we had during good times,” said Burns. “How do we bolt that on to what we already have to let our newly distributed workforce continue to function and continue to generate revenue? Do we have governance wrapped around that to make sure that we can monitor what we need to monitor and be compliant where we need to be compliant?”

Six months into the pandemic, organizations should realize that they’re accumulating unnecessary risks if they don’t address the longer-term governance issues.

Though governance tends to be viewed as an internal-facing function, its role doesn’t end there. In fact, a recent Forrester report discusses the link between sound governance and better customer service. In a customer service context, the report’s authors said governance should include a cross-functional governance board, technology governance, process governance and data governance.

That’s sound advice generally.

Security

An obvious victim of rapid cloud adoption is security. There was no time to fully assess the new architecture from a security standpoint because business continuity and employee safety were more important. However, the potential security vulnerabilities left unchecked keep the door open for internal and external bad actors.

“It really comes back to the fundamentals,” said Burns. “Do we have the right security wrapped around [our architecture] so that we’re not exposing any of our data or access points?”

Meanwhile, the pandemic has fueled spikes in cyber fraud, and many of those campaigns have targeted work-from-home employees. In August, Interpol revealed it had observed a 350% increase in phishing sites and a 600% increase in malicious emails. Home Wi-Fi routers also have been targeted, and several family members in an employee’s home may be sharing computers regardless of who owns them.

Enterprises need to ensure they’re educating employees about the work-from-home security risks to their organizations, especially since many of those individuals are attempting to balance their personal and professional lives. Hackers and social engineers know distracted individuals are easy targets.

Time to reassess

When the pandemic hit, there was no time for long-term thinking. Now, there’s a window of opportunity that shouldn’t be squandered. Whether a second COVID-19 wave occurs or not, businesses have an opportunity to assess where they’re at and compare that with where they need to be to ensure longer-term resilience.

“People are starting to understand that we’re not going to go back to work like normal tomorrow,” said Burns. “Really, it comes back to the fundamentals. Do we have the right technology in place? Are we moving in the right direction? We need KPIs that show us that.”

Digital transformation has taken on new meaning in 2020 because it isn’t just about responding to digital disruption anymore. It’s about doing whatever it takes to survive and thrive no matter what happens. Essentially, last year’s playbook is old news.

“The rules of the game have completely changed. We’re not solving for the same X anymore,” said Burns. “We’re solving new problems that we haven’t taken the time to identify. We need to put out fewer fires and make more strategic decisions.”

Otherwise, enormous amounts of technical debt will continue to accumulate.

 

Cloud Strategies Aren’t Just About Digital Transformation Anymore

Organizations have been transferring more data, workloads, and applications to the cloud to increase the pace of innovation and organizational agility. Until recently, digital transformation was accelerating at its own pace. However, cloud adoption got a major shove as a result of the crisis, which can be seen in:

  • Dramatic remote work spikes
  • Capital expenditure (CapEx) reductions
  • Business model adaptations to maintain customer relationships

In fact, in a recent blog, Forrester reported robust 2020 first-quarter growth for the top three providers, with AWS at 34%, Microsoft Azure at 59%, and Google Cloud Platform at 52%. The driver, according to Vice President and Principal Analyst John Rymer, is that “Faced with sudden and urgent disruption, most enterprises are turning to the big public cloud providers for help.”

“We are seeing a huge increase in our clients wanting to digitize in-person processes and ensure they are accessible 24/7 and integrated with existing technologies through utilizing cloud services [such as] developing contactless ordering systems for physical retail locations, which both reduce the need for face-to-face interaction, but also sync with existing POS and stock management systems,” said Bethan Vincent, marketing director at UK digital transformation consultancy Netsells Group. “This requires both API integrations and a solid cloud strategy, which seeks to build resilience into these new services, protecting against downtime and the knock-on effect of one system affecting another.”

Jiten Vaidya, PlanetScale

Speaking of resiliency, there is a corresponding uptick in Docker and Kubernetes adoption. “We have seen an interest in databases for Kubernetes spike during the COVID-19 pandemic. Kubernetes had already emerged as the de facto operating system for computing resources either on-premise or in the cloud,” said Jiten Vaidya, co-founder and CEO of cloud-native database platform provider PlanetScale. “As the need for resiliency and scalability becomes top of the mind, having this uniform platform for database deployment is becoming increasingly important to enterprises.”

While business continuity isn’t the buzzy topic it was during the Y2K frenzy, many consulting firms and technology providers say it’s top of mind once again. However, it’s not just about uptime and SLAs, it’s also about the continuity of business processes and the people needed to support those business processes.

Greater remote work is the new normal

Chris Ciborowski, CEO and co-founder of cloud and DevOps consulting firm Nebulaworks, said many of his clients have increased their use of SaaS platforms such as Zoom and GitLab/GitHub source code management systems.

“While these are by no means new, there has been a surge in use as identified by the increased load on the platforms,” said Ciborowski. “These are being leveraged to keep teams connected and driving productivity for organizations that are not used to or built for distributed teams. [M]any companies [were] already doing this pre-pandemic, but the trend is pouring over to those companies that are less familiar with such practices.”

Chris Ciborowski, Nebulaworks

Dux Raymond Sy, CMO and Microsoft MVP + regional director at AvePoint, which develops data migration, management and protection products for Office 365 and SharePoint, has noticed a similar trend.

“Satya Nadella recently remarked [that] two years of digital transformation has happened in two months,” said Sy. “Organizations and users that were on the fence have all adopted the cloud and new ways of working. They didn’t have a choice, but they are happy with it and won’t revert to the old ways.”

However, not all organizations have learned how to truly live in the cloud yet. For example, many have adopted non-enterprise, consumer communication and/or collaboration platforms, which have offered free licenses in response to COVID-19. However, fast access to tools can result in ad-hoc, unstructured and ungoverned processes.

“Adoption isn’t a problem anymore, but now productivity and security are. As we emerge from the post-pandemic world, organizations are going to need to clean up their shadow IT, overprivileged or external users that can access sensitive data they shouldn’t and sprawling collaboration environments,” said Sy. “The other mistake we are seeing organizations make is not continuously analyzing their content, finding their dark data, and reducing their attack profile. Organizations need to make a regular habit of scanning their environments for sensitive content and making sure it is where it is supposed to be or appropriately expire it if it can be deleted. Having sensitive content in your environment isn’t bad, but access to it needs to be controlled.”

Dux Raymond Sy, AvePoint

All the cybersecurity controls organizations have been exercising under normal conditions are being challenged as IT departments find themselves enabling the sudden explosion of remote workers. In fact, identity and access management company OneLogin recently surveyed 5,000 remote workers from the U.S. and parts of Europe to gauge the cybersecurity risks enterprises are facing. According to the report, 20% have shared their work device password with their spouse or child, which puts corporate data at risk, and 36% have not changed their home Wi-Fi password in more than a year, which puts corporate devices at risk. Yet, 63% believe their organizations will be in favor of continued remote work post-pandemic. One-third admitted downloading an app on their work device without approval.

“Organizations everywhere are facing unprecedented challenges as millions of people are working from home,” said Brad Brooks, CEO and president of trusted experience platform provider OneLogin in a press release. “Passwords pose an even greater risk in this WFH environment and — as our study supports — are the weakest link in exposing businesses’ customers and data to bad actors.”

CapEx loses more ground to OpEx

SaaS and cloud have forever changed enterprise IT financial models, although many organizations still have a mix of assets on-premises and in the cloud. In the wake of the 2008 financial crisis, businesses increased their use of SaaS and cloud. Digital transformation further fueled the trend. Now, CFOs are taking another hard look at CapEx as they fret about cash flow.

Suranjan Chatterjee, Tata Consultancy Services

“The pandemic has crystallized the fact that there are basically two types of companies today: those that are able to deliver digitally and connect to customers remotely, and those that are trying to get into this group,” said Miles Ward, CTO at business and technology consulting services firm SADA. “Since the world turned on its head the past few months, we’ve seen companies in both groups jump on cloud-based tools that support secure connections, scaled communications, rapid development and system access from anywhere, anytime. Using these tools, companies can reduce their risk; nothing feels safer than going from three to five-year commitments on infrastructure to easy pay-as-you-go, and pay only for what you use, commitment-free systems.”

Business models have shifted to maintain customer relationships

Businesses negatively impacted by shelter-in-place and stay-at-home executive orders have reacted in one of two ways: adapt, or shut down temporarily until the state or country reopens. The ones that have adapted have been relying more heavily on their digital presence to sell products or services online, with the former supplemented by curbside pickup. The businesses that shut down completely tended to have a comparatively weak digital strategy to begin with. Those companies are the ones facing the biggest existential threat.