How CTOs Can Innovate Through Disruption in 2020

CTOs and other IT leaders need to invest in innovation to emerge from the current COVID-19 crisis ready for the next opportunities.

Are you ready for 2021’s opportunities? Are you ready for the new business models that will emerge once the COVID-19 coronavirus is behind us? What strategic technology moves will your organization make today to invest in the innovation needed to bring your enterprise out of the current crisis stronger and better?

CTOs and other senior technology leaders should now be focusing on these key questions as we enter the second half of 2020. Sure, it was critically important to pivot instantly to enable working from home in the first half of this year. Yes, there’s still work to be done improving the systems that enable employees to work from home, especially since organizations are making many of these arrangements permanent. However, the strategic longer term moves that senior leaders make today are what will help their organizations emerge stronger on the other side of this crisis.

CTOs are now at risk of focusing solely on short-term needs when it is equally important to plan for technology and innovation initiatives to help their organizations come out of the crisis and meet post-coronavirus challenges, according to a new report from Gartner, How CTOs Should Lead in Times of Disruption and Uncertainty.

Read all our coverage on how IT leaders are responding to the conditions caused by the pandemic.

Disruption is nothing new for technology leaders. In Gartner’s survey of IT leaders, conducted in early 2020 before the coronavirus pandemic struck, 90% said they had faced a “turn” or disruption in the last 4 years, and 100% said they face ongoing disruption and uncertainty. The current crisis may just be the biggest test of the resiliency they have developed in response to those challenges.

“We are hearing from a lot of clients about innovation budgets being slashed, but it’s really important not to throw innovation out the window,” said Gartner senior principal analyst Samantha Searle, one of the report’s authors, who spoke to InformationWeek. “Innovation techniques are well-suited to reducing uncertainty. This is critical in a crisis.”

The impact of the crisis on your technology budget is likely dependent on your industry, Searle said. For instance, technology and financial companies tend to be farther ahead of other companies when it comes to response to the crisis and consideration of investments for the future.

Other businesses, such as retail and hospitality, just now may be considering how to reopen. These organizations are still focused on fulfilling the initial needs around ensuring employees and customers are safe. In response to the short-term crisis, CTOs and other IT leaders were likely to focus on things like customer and employee safety, employee productivity, supply chain stabilization, and providing the optimal customer experience. But the innovation pipeline is also a crucial component.

Innovation doesn’t necessarily have to cost a lot of money. Budgets are tight, after all. Searle suggests incremental innovations and cost optimizations, gaining efficiencies where they are achievable.

Consider whether you’ve already made some investments in AI, chatbots, or other platforms. Those are tools that you can use to improve customer experience during the ongoing crisis or even assist with better decision making as you navigate to the future.

Remember, investments will pay off on the other side. For instance, companies that thought more about employing customer safety measures are the ones that will come out better in terms of brand reputation.

In a retail environment, for instance, an innovation for employee and customer safety might be replacing touch interfaces with voice interactions.

Searle said that the crisis has also altered acceptance of technologies that may not have been desirable in the past. For instance, before the pandemic people generally preferred seeing a doctor face-to-face rather than via a telemedicine appointment.

“That’s an example of where societal acceptance of the technology has changed a lot,” she said.

Another example that was not quite ready for prime time as the crisis hit is the idea of drones and autonomous vehicles making deliveries of groceries, take-out orders, and other orders. However, those are technologies that companies can continue to invest in for the longer term benefits.

Another key action CTOs and other IT leaders should take is trendspotting, Searle said. Trends can center on emerging technologies such as AI, but they can also be economic or political. The current pandemic shows that disruption is the new order, and that treating emerging technology as the only catalyst of disruption has been a misstep by many organizations, according to Searle. She recommends using trendspotting efforts to assemble a big picture of the trends that will shape strategic technology decisions as your organization begins to rebuild and renew.

As for challenges over the next six months, CTOs remain focused on the near term. In an online poll during a recent webinar, Searle asked CTOs to name their biggest challenge. The largest share, 31%, cited improving customer experience. Others cited maintaining employee productivity (28%), infrastructure resilience (22%), supply chain stability (8%), and combating security attacks (8%)…[…] Read more »…..

 

Democratizing Cybersecurity Protects Us All

Cybersecurity is a sophisticated art. It can truly consume the time and resources of IT teams as they work to safeguard valuable data from the growing risk of cyberattacks and data breaches. The technical nature of it, along with the specific expertise it requires, has created a workforce gap that many fear is nearly impossible to bridge.

By Akshay Bhargava, Chief Product Officer at Malwarebytes

In fact, the cybersecurity workforce gap has been reported to be over four million people globally, leaving an alarming void of security experts equipped to protect business and consumer data. The gap is especially painful for small and midsize businesses (SMBs), where recruiting cybersecurity expertise can be particularly costly or challenging. Unfortunately, with the average cost of a breach weighing in at a hefty $3.92 million, cybersecurity is not something any business – no matter the size – can afford to get wrong. This is all the more concerning for SMBs, where estimates have found that as many as 60% are forced to shut their doors after a cyberattack.

But the damage caused by a successful attack can extend beyond the SMB itself.

Not only will the SMB suffer in the event of a cyberattack, but the larger enterprises it partners with are also put at risk. Take the 2019 Quest Diagnostics data breach as an example. Nearly 12 million patients were affected after hackers took control of a payments page belonging to one of Quest’s billing collection vendors, AMCA, exposing account data, Social Security numbers and health information. The same attack also impacted 7.7 million customers of LabCorp. AMCA has since filed for bankruptcy.

A similar chain of events played out in 2013, when an email malware attack on a heating, air conditioning and refrigeration vendor used by Target Corp. yielded the network credentials that let attackers expose the credit card and personal data of more than 110 million consumers.

In each instance, the exposure of a smaller organization put a much larger enterprise at risk. There is hope though, that if we can democratize cybersecurity, SMBs could realize the same protections enterprises require, and we’d all be much safer as a result.

So, what can be done? How can SMBs achieve a cybersecure environment like their enterprise competitors? The key lies in automation and empowering employees.

Automation Unlocks Cybersecurity Democratization

Adopting security automation is an effective way to achieve cyber resilience without adding staff or cost burden. It’s the core of cybersecurity democratization. In fact, companies that fully deploy security automation realize an average $1.55 million in incremental savings when handling a data breach. Not only will automation relieve the pressure from continued staff and skills resource constraints, it’s also dynamically scalable, always on, and enables a more proactive security approach that makes the business exponentially more secure. When applying automation, consider each of these three critical security process areas:

1. Threat detection and prevention. Technologies including advanced analytics, artificial intelligence and machine learning give SMBs the ability to apply adaptive threat detection and prevention capabilities so that they can stay one step ahead of cybercriminals without added staff. By automating threat detection, powered by strong threat intelligence, SMBs can detect new, emerging threats while also increasing the detection and prevention of known threats that may have previously slipped past corporate defenses. Furthermore, they can reduce the noise from incident alerts and false positives from detection systems, improving overall threat detection and prevention success rates.

2. Incident response. If a successful cyberattack does break through, it can move through an environment like wildfire. Incident response time is critical to mitigating the severity of the damage, and for SMBs affected by the security skills shortage, fielding a response team that can react fast is likely a problem. By automating incident response, organizations can greatly improve their cyber resilience. Adopt solutions that will automatically isolate, remediate and recover from a cyberattack (a minimal sketch of this flow appears after the list):

  •  Isolate. By automating endpoint isolation SMBs are able to rapidly contain an infection while also minimizing disruption to the user. Effective isolation includes the automated containment of network, device and process levels. Advanced solutions will also impede malware from “phoning home” which will restrict further damage to the environment.
  • Remediate. Automating remediation will quickly and effectively restore systems without requiring staff resource time or expertise. It will also allow CISOs to remediate endpoints at scale to significantly reduce the company’s mean-time-to-response.
  • Recover. Finally, incident response should also provide automated restore capabilities to return endpoints to their pre-infected, trusted state. During this recovery process it’s also wise to enable automated detection and removal of artifacts that may have been left behind during the incident. This is essential to preventing malware from re-infecting the network.

3. Security task orchestration. To further relieve security staff while ensuring cyber resiliency, low-level tasks should be automated, including the orchestration between complex, distributed security ecosystems and services. This will ensure a more nimble and responsive environment in the event a cyberattack is successful. Cloud-based management of endpoints can help, specifically if it provides deep visibility with remediation maps[…] Read more »…..
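To make the isolate, remediate and recover flow above concrete, here is a minimal, vendor-neutral sketch of what an automated response pipeline can look like. Every name in it (EndpointAgent, ThreatAlert, handle_alert) is a hypothetical illustration under these assumptions, not any specific product’s API.

from dataclasses import dataclass
from enum import Enum, auto


class Severity(Enum):
    LOW = auto()
    HIGH = auto()
    CRITICAL = auto()


@dataclass
class ThreatAlert:
    endpoint_id: str
    indicator: str          # e.g., file hash or C2 domain that triggered detection
    severity: Severity


class EndpointAgent:
    """Hypothetical endpoint-management client; a real deployment would call an EDR vendor's SDK."""

    def isolate(self, endpoint_id: str) -> None:
        # Contain at the network, device, and process level; block "phone home" traffic.
        print(f"[isolate] quarantining {endpoint_id}")

    def remediate(self, endpoint_id: str, indicator: str) -> None:
        # Kill malicious processes and remove files matching the indicator, at scale.
        print(f"[remediate] removing {indicator} from {endpoint_id}")

    def restore(self, endpoint_id: str) -> None:
        # Return the endpoint to its pre-infection, trusted state and sweep for leftover artifacts.
        print(f"[restore] {endpoint_id} back to trusted baseline")


def handle_alert(agent: EndpointAgent, alert: ThreatAlert) -> None:
    """Automated isolate -> remediate -> recover flow; no analyst needed for known threats."""
    if alert.severity is Severity.LOW:
        return  # low-severity alerts can be queued for human review instead
    agent.isolate(alert.endpoint_id)
    agent.remediate(alert.endpoint_id, alert.indicator)
    agent.restore(alert.endpoint_id)


if __name__ == "__main__":
    handle_alert(EndpointAgent(), ThreatAlert("laptop-042", "evil.example.com", Severity.CRITICAL))

The point of the sketch is the shape of the automation, not the particulars: detection feeds a pipeline that contains, cleans, and restores an endpoint without waiting on scarce staff.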

This article first appeared in CISO MAG (www.cisomag.com).

Leveraging packet data to improve network agility and reduce costs

Global enterprises spend over $100 billion a year on cybersecurity, but multi-vector threats can still find a way to invade network infrastructures. IT teams need to protect numerous and varied entry points, including mobile devices, and new technologies like the Internet of Things (IoT), virtualization, Wi-Fi hotspots and cloud applications.

At the same time, service providers need secure access to data centers, equipment and campus environments with near-zero network performance latencies. They must also gain visibility into encrypted traffic so they can safeguard their resources.

However, the most vital of these assets is packet data, which offers a shortcut to a comprehensive visibility-driven security program encompassing threat detection and precise investigative capabilities. IT teams can also add controls, flexibility and scalability by delivering the right packets to tools as needed. Throughout this process, they will improve recovery times and increase the return on investment for their cybersecurity budget.

The current landscape

Network administrators are working hard to meet the continuous demands for higher bandwidth while delivering a superior user experience. To do so, they need to gather real-time insights, improve productivity, and stay within monetary constraints. That’s a tough balance to strike, especially given the increased number of vulnerabilities affecting safety, governance, and compliance.

Over 20 billion connected devices are in use worldwide, and cybercriminals are updating their strategies to fit this new environment. Attackers exploit faster internet speeds, next-generation tools, and bad-actor hosting sites to create a wide range of sophisticated attacks. These can include malware, spam services, encrypted attacks to exfiltrate data, beaconing and command-and-control (C2) communications, Distributed Denial of Service (DDoS) attacks, and other malicious traffic. They target networks and collect sensitive data from right under victims’ noses. With increased targeting of edge services, organizations must adopt a holistic approach to securing their entire distributed security visibility network so it delivers the right packet data to their security systems. That begins with a comprehensive security visibility fabric architecture.

The most crucial preventive measure is rapidly addressing application performance issues through actionable insights. Operators can mitigate DDoS attacks at the edge quickly with automated solutions that protect packet data while minimizing risk. They should move storage workloads to the cloud as an extra layer of security.

IT teams who can’t see encrypted traffic face dangerous blind spots in their security, which could lead to financial losses, data breaches, and heaps of bad press. Because of this, it’s essential to protect networks and get smart visibility into these issues.

Regulatory bodies and organizations are shifting to the use of – and even mandating – ephemeral key encryption and forward secrecy (FS) to address the need for greater user security. To keep monitoring infrastructure viable, companies will need to offload Secure Sockets Layer (SSL) decryption so that tool capacity can keep up, performing decryption once and inspecting many times to reduce latency and scale the security infrastructure. Having a network packet broker in place to direct specific traffic to your SSL decryption appliance enables that decryption step. It also enables security service chaining, delivering the decrypted packet data to the various security systems that maintain and monitor the environment for optimal performance.
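The "decrypt once, inspect many times" pattern described above can be illustrated with a small sketch. The functions below (decrypt_once, ids, dlp) are hypothetical stand-ins for a decryption appliance and the inspection tools in a service chain, not a real packet broker API.

from typing import Callable, Iterable

InspectionTool = Callable[[bytes], bool]  # returns True if the payload is allowed


def decrypt_once(tls_record: bytes) -> bytes:
    """Stand-in for the SSL/TLS decryption appliance; real deployments terminate TLS here."""
    return tls_record  # assume already-decrypted bytes for this sketch


def service_chain(tls_record: bytes, tools: Iterable[InspectionTool]) -> bool:
    """Decrypt a flow once, then feed the cleartext to every inspection tool in the chain."""
    cleartext = decrypt_once(tls_record)
    return all(tool(cleartext) for tool in tools)


# Hypothetical inspection functions standing in for IDS and DLP systems in the chain.
def ids(payload: bytes) -> bool:
    return b"attack-signature" not in payload


def dlp(payload: bytes) -> bool:
    return b"SSN:" not in payload


if __name__ == "__main__":
    allowed = service_chain(b"GET /index.html", [ids, dlp])
    print("forward to destination" if allowed else "drop and alert")

Because decryption happens exactly once, each additional tool in the chain adds inspection coverage without adding another round of costly decryption.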

What the industry needs 

Many organizations don’t have the proper protective measures in place to fight attackers. They need to embed that protection into their workflows, because doing so allows for the rapid detection of issues within both physical and virtual infrastructures.

Enterprises are adopting emerging technologies to handle growing traffic volumes and network speeds. The increase in web applications and multimedia content has spurred a growing demand for simplified data center management, automation and cloud services. As a result, the packet broker market is flourishing with research predicting that the segment will be worth $849 million by 2023.

At the same time, network administrators must provide smart and flexible security solutions while reducing capital expenditures. IT teams can simplify these processes using distributed architecture. To do so, they need a cost-effective, scalable solution with no blind spots, which allows them to evolve packet data storage.

Operators and security administrators who base their actions on up-to-the-minute traffic reports can make decisions in real-time. Devices, applications and public and private clouds all aid in this mission by detecting threats throughout the network.

Why visibility is essential

Security is about controlling risk, and risk is defined by loss exposure. How can a business identify and manage risk? Companies need to be crystal clear on how they think about risk and have a thorough understanding of what they consider assets. Having control is only possible with visibility into the network that provides access to those assets. Overcoming challenges and maximizing security requires a pervasive visibility layer that reduces downtime while increasing return on investment and enabling efficient operations.

The good news is enterprises are improving visibility as they analyze more information. IT departments need to follow suit by obtaining high-quality packet data and real-time insights. Tech teams can then protect systems from cyberattacks, provide reliable service assurance and comply with regulations.

Enterprises should monitor their infrastructure continuously so they can detect threats before they happen..[…] Read more »….

 

 

The ever-changing role of a CSO with David Levine

Having held a wide variety of positions during his 23-year tenure with Ricoh, Vice President of Corporate and Information Security and CSO David Levine shares his perspective on the role of the CISO, how he stays abreast of industry trends and, in the current COVID-19 era, what it means to manage a remote team.

 

Q: How has the role of the CISO changed over your career?

A:  The CISO role has continued to grow in organizational and strategic importance within many businesses, including Ricoh. What was once a blended function in IT is now its own critical function with its leader (CISO/CSO) having a seat at the table and reporting, if applicable, to the board on a regular basis. That’s a significant transformation!

Q: What is the biggest challenge for a CISO today?

A: This ties into my answer above: security budgets and staffing have not necessarily kept pace with increasing demands and importance. As more of the organization, as well as customers and partners, realize they need to engage and include security, the team gets spread thinner. This can put a real strain on the organization and its effectiveness. Prioritization and risk assessment become critical to help determine what needs to be focused on. You also cannot ignore the fundamental challenge of just keeping pace with operational fundamentals like vulnerability remediation, patching, alert response and trying to stay ahead of highly skilled adversaries.

Q: How do you stay abreast of the trends and what your peers are doing?

A: I use a variety of approaches to track what’s going on relative to trends and my peers. Daily security email feeds are a great source to get a quick recap on the last 24 hours, leveraging one or more of the big research firms and being active in their councils is a great mix of access to analysts and peers. I am also active in the CISO community and participate in events run by great organizations like Apex. 

Q: What advice would you give an early stage CIO or CISO joining an enterprise organization?

A: Although I have been with Ricoh for many years, if I was moving to a new organization, I would take the time to ensure I understand:

 

  • the company’s objectives and priorities; 
  • what’s in place today and why;
  • what security’s role in the organization has been;
  • what’s working and what isn’t.

 

I’d also commit to completing initial benchmarking and make sure I spent time, upfront, to start to build solid relationships with key stakeholders.

Q: Have you been putting cloud migration first in your organization’s transformation strategies?

A: We adopted a cloud first mentality a few years ago. The cloud isn’t perfect for everything but in many cases it’s a great solution with a lot of tangible advantages.

Q: What are your Cloud Security Challenges?

A: For us, one of the biggest challenges is keeping pace with the business from a security and governance standpoint. We are currently working on putting in comprehensive policies and requirements, along with tools like a checklist, which will make it clear what’s needed and also enable the various teams to do some of the upfront work without needing to engage my team. That’s a win-win for everyone and reduces the likelihood of a bottleneck.

Q: What are your top data priorities: business growth, data security/privacy, legal/regulatory concerns, expense reduction?

A: YES! In all seriousness, those are all relevant priorities my team and I need to focus on. This further adds to the prior points around more work than hours and resources. 

Q: Did you have specific projects or initiatives that have been shelved due to COVID-19 and current realities?

A: Like most of my peers that I have talked to, we have put on hold most “net new” spending for now. The expectation is we will get back to those efforts a bit down the road. We are also taking a look to see what opportunities we have to streamline expenses.

Q: Has security been more of a challenge to manage while your teams have shifted to a Work From Home structure?

A: I am proud of my teams and the ecosystem we put in place. All in all, it’s been a pretty smooth transition. My team is geographically dispersed and a few key resources were already remote. However, that is not to say there aren’t any challenges – not being able to put hands on devices has made some investigations and project work more difficult but we’ve found safe ways to complete the tasks. Ensuring the teams stay connected and communicate is also important. 

Q: What were/are the most significant areas of change due to COVID-19?

A: We certainly had to make some exceptions to allow access and connectivity that we would not have made under normal circumstances, but it was the right thing to do for our business and our customers. We also had to shift some users to work from home who typically would not and, as such, didn’t have the right resources. Both of these highlighted areas to focus on in the next revisions of our Business Continuity Plans, which contemplated the need to shift work and locations but not necessarily everyone working from home. There is also a need to reemphasize security, policies and training when working from home.

Data Privacy and Data Security: Outsourcing to Third Parties and the Effect on Consumers, Companies, and the Cybersecurity Industry as a Whole

With the recent increase of global data privacy regulations and their ramifications on multinational organizations, it is crucial to examine the differences between data privacy and data security, why these nuances matter, and the impact they have on cybersecurity trends for not only organizations, but consumers.

Twenty years ago, data protection and information security were largely viewed as complementary activities. In today’s environment, data protection is rarely articulated without its privacy counterpart, and information security has transformed into “cybersecurity” to reflect the multiple threat factors that data now faces.

Typically, cybersecurity is described as the intersection of three principles: confidentiality, integrity, and availability (CIA). If one of these core components fails or is otherwise wrongly configured, the resulting vulnerability could lead to a breach of information, commonly by means of unauthorized access, leakage, or wrongful deletion due to poor policy, weak risk management, or immature security practice.

Data privacy is often defined as the protection of sensitive data, typically referencing personally identifiable information (PII), such as a social security number, race, ethnicity, and age. Depending on the sector, regulation, or jurisdiction, the definition of which data is considered “sensitive” will vary and can expand beyond personal types of information to assets like trade secrets, intellectual property, or financial and operational data. The problem with this definition of data privacy is that the protection of this information is viewed more as a security attribute, lending to the longstanding proverb that you cannot have privacy without security.

If you reflect on the information trends since the turn of the last millennium, we experienced a shift to the cloud in the early 2000s, where organizations moved servers and other hardware assets to centralized vendors that maintain data center environments at scale. With this migration, the world’s first Software-as-a-Service (SaaS) companies came online at the height of the dot-com bubble.

The “as a service” business model placed a new dependence on service organizations when their customers outsourced critical elements of their supply chain for operational efficiencies or for the ability to scale quickly without having to gain expertise in an industry not core to their product. This reliance on third parties created increased security risks since more companies would now have access to the same information that was previously received, managed, and maintained all under the same roof.

The effect on consumers

Beginning in the 2010s, data breaches that affected consumers due to stolen credit card data, like those disclosed by Adobe, Target, and Home Depot all occurring within the same year, made data security a hot topic for consumers for the first time, causing boards and regulators to inquire about the controls in place to mitigate these threats. However, it was not until recently that consumers shifted that mindset to include data privacy, after public breaches exposed health and personal information at Anthem, Uber, Adult Friend Finder, and Marriott. These data breaches made headlines, and consumers began to ask, ‘what data are you storing for me, how do you plan to use this data, and how long will it be retained?’.

Lawmakers and regulators took notice of this shift to consumer protectionism and began to mandate public changes in normal business operations in lieu of federal privacy laws.

The effect on companies

With so many checkpoints to consider when engaging a new vendor, and the stakes for proper due diligence higher than ever, organizations began to turn to assessment firms for assurance around these security controls. Assistance is needed because companies are unable to audit every service provider that might interact with user or customer data. In the United States, an organization may request a System and Organization Controls (SOC) 2 report, an examination by a competent Certified Public Accountant (CPA) of their security controls based on set criteria. Or they may seek ISO 27001 certification, an accredited, point-in-time report on the conformity of their activities to requisite management processes and control objectives, establishing a baseline for what is considered a minimum state of security maturity.

Due to the shift in consumer focus on privacy considerations, globally recognized assurance programs have only recently been developed. In August 2019, the International Organization for Standardization (ISO) released the ISO 27701 standard – requirements and guidance for establishing a Privacy Information Management System (PIMS) for organizations that are controllers and/or processors of sensitive information like PII. While data privacy legislation had been around for several years through mechanisms like the EU-U.S. Privacy Shield and, more recently, the General Data Protection Regulation (GDPR), ISO 27701 is the first assurance program that organizations could certify demonstrating their commitment to privacy based on the legal context affecting their data subjects.

In the months following the release of ISO 27701, organizations such as Alibaba, Huawei, Microsoft, Accenture, Blackhawk Network, and OneTrust have certified to the new standard; however, these certified organizations plus a multitude of others looking to match the achievement have quickly realized that privacy hygiene requires different resources and in-house skill sets than were needed with their security program.

The challenges of incorporating data privacy

One of the top challenges security teams face when building a privacy program on top of their existing security management system is how to expand the enterprise risk assessment to include risks that threaten the protection of PII. They inherently gravitate towards thinking about this new taxonomy of risk in terms of the foundational CIA principles, but neglect to consider the rights of the data subject. As a result, they have been forced to merge security personnel with privacy personnel to complete this task, which now exposes a new problem – many organizations do not have privacy personnel.

Looking at some Fortune 500 organizations, job titles such as Chief Security Officer or Chief Information Security Officer (CISO) are far more commonplace than Chief Privacy Officer. Often, the privacy function of an organization is absorbed by General Counsel or outsourced to law firms kept on retainer. Early ISO 27701 certification plans at the largest processors of personal information in the world have been halted after discovering their security departments have little to no connection to their in-house privacy teams, if they exist at all. This results in a remediation only possible through a major shift in the organizational chart or hiring of competent personnel…[…] Read more »

 

Cyber Work Podcast: Growing the number of women in cybersecurity with Olivia Rose

Introduction

Cybersecurity is a field on the cutting edge, yet when it comes to gender parity, there’s still much progress to be made. For women, breaking into a male-dominated field like cybersecurity comes with a unique set of challenges.

Data from the (ISC)² Cybersecurity Workforce Report reveals that the landscape of women in cybersecurity is complex and — at least in some ways — evolving:

  • Women make up 24% of the cybersecurity workforce — a major increase from 11% in 2017
  • Women earn more degrees and cybersecurity certifications on average
  • More women than men hold leadership roles like IT Director, CISO and CIO

Seeing these numbers on the rise is exciting and encouraging. However, not all of the statistics are positive:

  • Of women in cybersecurity, 56% will leave to pursue jobs in another field
  • 17% of women earn salaries between $50,000 and $99,999, compared to 29% of men
  • Women in security management roles earn an average of $5,000 less than men in the same roles

In Infosec’s podcast “Growing the Number of Women in Cybersecurity,” Olivia Rose, the director of global executive risk solutions at Kudelski Security, shares her experiences as a woman in the field and offers some valuable advice to women considering a career in the cybersecurity world.

What can companies do to encourage women and minorities to take cybersecurity jobs? And just as important, how can companies encourage them to stay?

Network to overcome isolation

For many women working in cybersecurity, it’s unfortunately easy to feel like a stranger in a strange land. It’s not uncommon to be the only woman on a team or in an entire department, and the feeling of being the “odd woman out” can be enough to drive women to look for jobs in fields with better minority representation.

This leads us to the million-dollar question: what can cybersecurity companies do to make women feel less isolated at work? In this case, the most obvious answer (hire more women) is only one part of the equation, since retention rates for women in cybersecurity are also quite low.

According to Rose, access to networking opportunities is vital. Encouraging women to participate in conferences and professional groups can help them meet other women in the field and foster the sense of community they’ve been missing at work. For women trying to get their foot in the door, Rose suggests volunteering at conferences because it waives the fee! RSA, SecureWorld and ISACA are just a few of the many conferences available to women in information security.

Close the confidence gap

Self-doubt and insecurity can loom over women’s cybersecurity careers like storm clouds on an otherwise sunny day. Many women experience Imposter Syndrome, which is the perception that they’re not as skilled or as smart as their colleagues or that they’re not good enough for the job.

Although men can also experience extreme self-doubt at work, women and minorities are much more susceptible to it. Why? It largely stems from feeling like an outsider. This feeling of being on the outside looking in has ramifications on women’s careers in cybersecurity.

Many women feel the need to prove their skills with certifications and degrees. On average, women in cybersecurity hold more certifications than their male colleagues. They’re also more likely to earn a postgraduate degree, according to the (ISC)² Cybersecurity Workforce Report. Rose has experienced this herself, saying, “You have to know your stuff. You may have to know your stuff more than the five other guys in the room.”

How can we help women feel more confident in cybersecurity jobs? Networking and mentorship are two powerful strategies. Since self-doubt is something that can’t be fought in isolation, connecting women with peers who understand what they’re going through can be immensely beneficial.

Recruit from non-traditional backgrounds

Despite the long-running debate on the value of a college degree in cybersecurity, many recruiters still prefer to hire people with degrees in STEM. That alone disqualifies a huge number of professionals, many of them women, who would make a big contribution to the field.

To hire more women in information security roles, recruiters will have to break the mold and look beyond traditional education requirements. Why? Because women don’t graduate from STEM programs at the same rate as men. In the 2015-2016 school year, women earned only 18.7% of bachelor’s degrees in computer and information sciences..[…] Read more »….

 

 

7 Ways to Improve Software Maintenance

Here are some approaches and steps organizations can take to perform software maintenance while creating as much time as possible for new software development.

In 2019, Tidelift, an open-source support and maintenance organization, surveyed software developers and found that they spent less than one third of their time (32%) developing new code. In the same survey, developers said that 35% of their time was spent on software maintenance.

My own experience in consulting with companies is that the amount of time spent on software maintenance is closer to 50%.

In either case, the time spent on maintaining software prevents organizations from pursuing new projects and getting things done.

At the same time, maintaining the software that you have created or inherited is a fact of life.

Software maintenance is defined as “a part of Software Development Life Cycle. Its main purpose is to modify and update software application(s) after delivery to correct faults and to improve performance. Software is a model of the real world. When the real-world changes, the software requires alteration wherever possible.”

Given this, what steps can organizations take to perform software maintenance while creating as much time as possible for new software development?

1. Listen to your help desk

No function in IT has a better finger on the pulse of application performance than the help desk. The help desk fields all of the questions and problems from users, so the people who work it know from the calls they get which applications are most problematic, and why. If more IT organizations folded help desk insights into their application development brainstorming and performance evaluations, they would be more successful at identifying areas of persistent application problems and failures, which could then either be fully repaired or retired and replaced with another solution. Just as importantly, the application trouble “hot spots” seen at the help desk offer lessons so the same mistakes aren’t repeated in new software development.

2. Engage QA

In too many organizations, developers up against tight deadlines tend to throw their work “over the wall” to QA at the last minute. Then, only partial application testing gets done before the app gets deployed into production. When the app goes live, there can be weeks of problem reports and troubleshooting, with fixes and workarounds resulting. Conversely, by thoroughly testing applications upfront for technical correctness, integration and usability, post-production software maintenance can be drastically reduced. To facilitate this, project managers need to plug in and ensure adequate times for software QA.

3. Consider a move to the cloud

Organizations using broken on-premises legacy software can consider making a break from endless maintenance by moving to a cloud-based version of the software that is offered and supported by the vendor. In a scenario like this, software maintenance is moved out of the shop and into the hands of the vendor. One disadvantage is that you never can be sure when the fixes or enhancements you want are going to get done — but the move could well be worth it if you can live with the inconvenience.

4. Sunset the applications that aren’t returning value

Almost every organization has a legacy system that no longer delivers the value it once did. This is a time to consider sunsetting that system and potentially planning a “rip and replace” with a new system. Rip and replace works when there are few needs to integrate the system with other software that is running. In cases where rip and replace is viable, you can shift much of your system maintenance for the new system to the supporting vendor.

5. Always regression test

The impulse when you’re under the gun to finish a project is to meet the deadline and skip some of the quality tests. One critical test is the regression test, which places a newly modified application in a simulated production environment with other applications to ensure that integration with those applications and called routines still works properly. When regression testing is skipped, the risk heightens that a newly modified app will break, or will cause other pieces of the system to break, because of a coding error that was introduced. This brings down systems and causes service outages..[…] Read more »…..
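As a concrete illustration of this point, a regression test can be as small as a couple of pytest cases that pin down behavior other systems already depend on. The calculate_total routine and its expected values below are hypothetical; the idea is simply that known-good results are asserted before and after every modification made elsewhere in the codebase.

# test_invoice_regression.py -- run with `pytest` before every release.
# calculate_total stands in for a routine that other applications call; a regression
# test guarantees its observable behavior does not change when new code is added.

def calculate_total(line_items, tax_rate):
    subtotal = sum(qty * price for qty, price in line_items)
    return round(subtotal * (1 + tax_rate), 2)


def test_existing_invoice_totals_unchanged():
    # Known-good result captured before the latest modification.
    assert calculate_total([(2, 9.99), (1, 5.00)], tax_rate=0.08) == 26.98


def test_integration_contract_for_callers():
    # Guards the contract relied on by downstream callers: empty carts stay zero.
    assert calculate_total([], tax_rate=0.08) == 0.0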

 

Here Come 5G IoT Devices: What Is “Reasonable Security”?

After years of waiting for 5G technology to transform industry and consumer devices, developments at this year’s Consumer Electronics Show suggest that 2020 may finally be the year when US companies make the leap.  Early signs show the healthcare and manufacturing sectors will lead the way this year in incorporating 5G and connected devices into their operations.

If the prognosticators are correct, our smart watches will soon talk to our refrigerators and order healthy groceries online.  And our doctors may receive real-time health updates from our workout equipment, pharmacies, and implanted medical devices.

The combination of 5G and the projected explosion in the number of IoT devices has industry excited, and the government focused on data security. 5G will allow massive evolution of products and services, leading to autonomous vehicles, remote surgery, and greater connectivity, automation, and precision in industrial manufacturing. This coming integration of and reliance on connected devices, the Internet of Things (IoT), raises myriad new privacy and security concerns, and lawmakers and regulators are ready to take action.

The New Year brought new state laws in California and Oregon focusing specifically on security requirements for connected devices.  The laws are the first in the nation, and portend a coming wave of laws, lawsuits, and regulatory actions focused specifically on data security.  Lawmakers are wrestling with how to keep consumers safe in the face of rapid technological advancement, and are falling back on the concept of “reasonable security” to bridge the gap.  But reasonable security may not be an easy standard for engineers to implement.

The California and Oregon laws require manufacturers of connected devices to integrate reasonable security measures that are (1) appropriate to the nature and function of the device; (2) appropriate to the information the device may collect, contain, or transmit; and (3) designed to protect the device and its information from unauthorized access, destruction, use, modification, or disclosure.

This may seem like a simple threshold, but these laws’ definition of “connected devices” is expansive, potentially sweeping in security cameras, household assistants, vehicles, and, in the case of California, industrial manufacturing equipment. Each category of device will have a different level of sophistication, different uses, different interactions with data, and different manufacturing requirements. What may be reasonable for a wifi-enabled juicer is not going to be reasonable for a connected vehicle.

The increasing inability of laws and policies to keep pace with advancements in technology means that efforts to address these issues are going to be crafted in an overly broad and flexible manner.  The California and Oregon laws, as well as similar efforts at the federal level, reflect a struggle to empower the government to address problems, the exact contours of which are not completely known or understood.  Rather than be behind the curve of a particular problem, these laws impose broad requirements that will evolve over time.

At the same time, laws run the risk of codifying standards that may be inapt or quickly become obsolete.  The California and Oregon laws provide that “reasonable security” can be satisfied by equipping a device with a unique preprogrammed password or a requirement that the user generate a new means of authentication before gaining access to the device for the first time.  This may be reasonable for some devices, but the law also covers devices where a compromise in security could result in significant physical harm, and where more stringent security requirements would be appropriate.
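As a rough sketch of how a manufacturer might implement the second safe harbor (requiring the user to generate a new means of authentication before first access), consider the following firmware-style logic. The ConnectedDevice class and its methods are hypothetical assumptions for illustration, not language drawn from either statute or any real device SDK.

import hashlib
from typing import Optional


class ConnectedDevice:
    """Hypothetical firmware sketch: refuse all access until the owner replaces
    the factory state with their own credential."""

    def __init__(self) -> None:
        self.password_hash: Optional[str] = None  # no usable credential until the owner sets one

    def set_password(self, new_password: str) -> None:
        if len(new_password) < 12:
            raise ValueError("password too short")
        # A production device would use a salted, slow hash (e.g., bcrypt); sha256 keeps the sketch short.
        self.password_hash = hashlib.sha256(new_password.encode()).hexdigest()

    def authenticate(self, password: str) -> bool:
        if self.password_hash is None:
            # The user must generate a new means of authentication before first access.
            raise PermissionError("initial setup required: set a device password first")
        return hashlib.sha256(password.encode()).hexdigest() == self.password_hash

Whether logic this simple counts as "reasonable" for a given product is exactly the open question the laws leave to manufacturers.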

As security and encryption approaches continue to advance, the password requirements codified in the laws may actually be disincentives to the adoption of more effective—and reasonable—security practices.  So this is leaving engineers asking the question, what is reasonable security?

Unfortunately, “it depends” is the answer right now. Until regulators offer guidance on how they are going to interpret the requirements, or develop those standards through various enforcement actions, it will be up to manufacturers to develop industry-wide standards for what constitutes “reasonable security.” This may be particularly challenging in light of the expansive scope of these laws. The California Attorney General, at least, has previously endorsed the Center for Internet Security’s Critical Security Controls as a baseline for reasonable security. And some industries, like the automotive industry, already have good track records and mechanisms to establish industry standards. Emerging industries and existing companies unfamiliar with IoT and 5G may not be in such an advantageous position..[…] Read more »

 

 

Supercomputers Recruited to Work on COVID-19 Research

A consortium forms to crunch data that might help researchers get a better understanding of the virus faster.

A convergence of technology resources is being put to work to find answers in the fight against COVID-19. The White House Office of Science and Technology Policy and the U.S. Department of Energy reached out to the technology sector, bringing together IBM and other supercomputing powerhouses to support research into the virus.

The combination of private industry, academic resources, and government entities thus far has assembled 16 supercomputer systems that boast some 775,000 CPU cores and 34,000 GPUs. That computing power is tasked with running huge calculations for molecular modeling, epidemiology, and bioinformatics in order to hasten the research time spent on the virus.

Spearheaded by IBM, the key partners in the COVID-19 High Performance Computing Consortium include Amazon Web Services, Google Cloud, Microsoft, Massachusetts Institute of Technology, Rensselaer Polytechnic Institute, NASA, and others. The consortium is accepting research proposals online, then matching researchers with computing resources that might best accelerate their efforts.

John Kolb, vice president, information services and technology and chief information officer at Rensselaer Polytechnic Institute (RPI), says high-performance computing is an area of expertise for the university. “We’re on our third-generation supercomputer, an IBM DCS system, that we put in place in November,” he says. “It’s the most powerful supercomputer for a private university in the country.”

Kolb says the supercomputer’s architecture is meant to move data in and out of memory very quickly in large quantities. That lets users take on data-intensive problems. “It’s also very well-suited for some of the machine learning and AI things our researchers are involved with,” he says.

The effort to fight COVID-19, Kolb says, may include a lot of modeling of very large data sets once they become available. “You can start to look at issues around the spread of the virus and mitigation of the spread,” he says. “There could be some drug repurposing and perhaps development of new therapeutic candidates.”

There may be opportunities for new materials to filter out the virus, Kolb says, or to create items that are in short supply now.

RPI’s system uses the same Summit supercomputer architecture found at some of the Department of Energy labs, he says. “It will be interesting to see if we can have runs here that scale up on Summit or do we have runs on Summit that we could take over.” Kolb believes most of the problems the consortium will deal with may be multivariate. For example, that could mean taking into account the number of people, density, the effectiveness of social distancing, and the capacity of hospitals. “We’re clearly trying to explore some things that may have some great promise, but there’s some great computing and science that need to come into play here,” Kolb says.

The greater emphasis in recent years on technology and compute in the public, private, and academic sectors may mean there can be more hands on deck to support research into the virus. “COVID-19 is going to see a fair amount of data analytics and the use of AI and machine learning tools to think through what are the most promising possibilities going forward,” Kolb says. “Across the country and world, we’re developing much more expertise in this area.”

IBM got involved in this fight believing it could coalesce a team around bringing computational capability to bear on investigating the virus, says Dave Turek, vice president of technical computing at IBM Cognitive Systems. “It was prompted by experiences IBM’s had applying computational biology, molecular dynamics, and material science to a variety of scientific problems,” he says.

Bringing scientific perspective and computing expertise together, Turek says, could create a set of resources that can be used broadly. It also gives researchers access to supercomputing they might not otherwise have, he says. “It’s a massive, massive amount of computing,” he says.

The way the consortium is established, other interested organizations can make their resources available as well, Turek says. “This is really a clearinghouse,” he says. “We have scientists and computer scientists sitting on review committees on proposals that are coming in to ensure the science is dedicated to the most appropriate platform to the task at hand.”

The momentum and application of technology such as supercomputers that was already underway could help narrow the time research efforts may take. “Even inside IBM, we did modeling on the evolutionary pathways of H1N1,” Turek says. “Those skills and experiences have been scaled up and leveraged over time”..[…] Read more »…..

 

How Small Businesses Can Protect Themselves from Cyberattacks

When most people think of cyberattacks, major data breaches at humongous companies like Equifax and Yahoo! typically come to mind. This is perfectly understandable, as these are the attacks that impact the most people and always make headlines. But cybercriminals don’t limit their attacks to large companies; they also target countless small businesses every year. And in many cases, these attacks destroy businesses and livelihoods.

By Zack Schuler, Founder and CEO of NINJIO

There’s no reason to put it delicately: The state of cybersecurity in the world of small and medium-sized businesses (SMBs) is nothing short of alarming. Not only are SMBs relentlessly targeted by hackers, but they’re also woefully unprepared to defend themselves and unequipped to handle the aftermath. This is a status quo that has to change immediately–SMBs are the biggest engine of the U.S. economy and they’re at risk like never before.

The Scope of the Problem

Every year, cyberattacks cost small businesses an average of almost US$80,000, and losses can range up to US$1 million (according to a report by the Better Business Bureau). Meanwhile, a 2018 study by the Ponemon Institute found that more than two-thirds of SMBs reported that they had been targeted by a cyberattack within the preceding year. Substantial majorities of SMBs also agree that cyberattacks are becoming more targeted, severe, and sophisticated, but despite these facts, almost half of respondents say they have no understanding of how to protect against cyberattacks.


Key findings from the report
  • Every year cyberattacks cost small businesses an average of almost US$80,000, and losses can range up to US$1 million.
  • A survey reports 88 percent of small business owners felt their business was vulnerable to a cyberattack.
  • Almost two-thirds of small businesses fail to act following a cybersecurity incident.
  • 56 percent of SMBs say defending mobile devices from cyberattacks is extremely challenging.
  • The top three attack vectors cited by SMBs are mobile devices, laptops, and cloud systems.
  • Just 16 percent of SMBs are “very confident in their cybersecurity readiness.”
  • 60 percent of SMBs lack a “cyberattack prevention plan.”

A recent survey by the U.S. Small Business Administration found that 88 percent of small business owners felt their business was vulnerable to a cyberattack. However, due to resource constraints, a lack of technical expertise, and the rapid pace of change in the cybersecurity world, they often feel helpless or ill-prepared to defend themselves against the vast range of cyberthreats they face.

In fact, a survey of more than 4,100 SMB cybersecurity professionals recently conducted by Forrester, found that almost two-thirds of small businesses fail to act following a cybersecurity incident. Even when the threat is right at their doorstep, many SMBs don’t know what to do.

The World is Changing for SMBs

There are many factors that contribute to the challenging cybersecurity situation for SMBs. First, digital operations are no longer optional for any company–even if your market is small and local, consumers are increasingly demanding the ability to do all their business online.

SMBs are changing the way they operate in the digital era. For example, a 2018 Cisco survey of SMBs found that the percentage of their networks that are on the cloud increased from 55 percent to 70 percent between 2014 and 2017. While almost 70 percent of SMBs say they’re making this transition for security reasons, an increased reliance on cloud-based services can also open up new vulnerabilities.

Meanwhile, other aspects of the digital transition have proved difficult for SMBs, 56 percent of which say defending mobile devices from cyberattacks is extremely challenging. Ponemon reports that the top three cyberattack vectors cited by SMBs are mobile devices, laptops, and cloud systems.

The Ponemon report also discovered that issues such as a lack of money, out-of-date cybersecurity technologies, and insufficient personnel are all major obstacles cited by SMBs. But the main threat cited in the report is employee negligence, as phishing/social engineering attacks were reported more than any other, while negligent employees or contractors were cited as the top root cause of the data breaches.

How SMBs can Protect Themselves

According to the Forrester survey cited above, just 16 percent of SMBs are very confident in their cybersecurity readiness. Despite the fact that SMBs are increasingly concerned about cybersecurity, Forrester also found that almost half of them don’t have a clearly defined strategy for protecting themselves. This is a common theme in surveys of SMBs. A 2019 Keeper survey found that 60 percent of respondents lack a cyberattack prevention plan..[…] Read more »…..

This article first appeared in CISO MAG (www.cisomag.com).