Low-Code Player Grabs RPA for Automation

Low-code platform vendor Appian is looking to provide a single platform for automation, AI and low code with a new RPA acquisition.

The year 2019 was a big one for consolidation among data, analytics, and related vendors as the industry reorganized for the cloud and big vendors staked out their competitive positions. Now, in early 2020, an acquisition in an adjacent technology may be signaling further changes ahead for another hot enterprise technology.

Low-code development platform vendor Appian has acquired startup Novayre Solutions SL, developer of the Jidoka robotic process automation (RPA) platform. In announcing the deal, Appian said the acquisition makes the company a “one-stop shop for automation, with best-in-class solutions for workflow, AI, and RPA.”

That’s something that many vendors and enterprises are pursuing. RPA and artificial intelligence are technologies that organizations often want to combine to automate tedious, repetitive tasks. Indeed, analyst firm Gartner named hyperautomation one of the top 10 strategic technology trends for 2020, saying that the No. 1 use case for artificial intelligence is automation. Pairing AI with RPA can streamline operations and make organizations more efficient. It’s another step toward the digital transformation that so many organizations are pursuing.

Appian’s plans for the acquisition are very much along these lines. The company said that it plans to unify low-code development and RPA into one comprehensive automation platform that enables “the orchestration of all three agents of modern work — humans, bots, and artificial intelligence.”

But the acquisition won’t exclude technologies from other RPA vendors. Appian said the platform will deliver enterprise RPA governance, enabling management of robotic workforces from the major RPA vendors, including Blue Prism and UiPath. That includes monitoring, scheduling and reporting. The service will be available on the Appian Cloud.

“Appian’s goal is to be a one-stop shop for Digital Process Automation (DPA) and RPA, and to build out a more complete Intelligent Automation (IA) platform, an increasing need as enterprises begin to scale automation initiatives,” wrote Forrester principal analyst and VP Craig Le Clair in a blog post.

Forrester defines digital process automation as dealing more with larger processes while RPA deals more with single tasks.

“RPA remains at the very center of many of these,” Le Clair wrote. “Appian is less likely to go head to head with the top RPA platforms but will look first to add RPA to their existing customers or find opportunities where DPA and RPA are both valuable.”

Le Clair also noted that for Appian, this is a technology acquisition. Jidoka’s architecture is Java-based and runs on Linux, and it is containerized and cloud-native. Read more »

 

The History of Security and the Fight to Protect Ourselves

While almost everyone in modern industry has heard and thought about cyberattacks, breaches, data compromises and defenses, cyber warfare pre-dates the modern computing era. As far back as 1976, when I started my first job in astrodynamics working on Air Force satellites, security was an important consideration, decades before the Internet and our powerful computing devices.

By Michael Miora, SVP & CISO, Korn Ferry

The security story I want to share begins in the late 1970s. As a young UC Berkeley graduate, my attention was on mathematics and getting a job! I never imagined that I would focus on security for the next few decades. I never envisioned myself as a critical decision maker, whose actions would affect the course and success of a multi-billion-dollar, global enterprise.

Understanding this security story will help us all be better at identifying what needs to be protected and how we need to define and design our protections.

With a background in mathematics, I opted to work on satellite orbit calculation and manipulation during my first job. However, my attention was quickly captured by the need to protect the information assets of the 1970s against our adversary, the then-Soviet Union.

Today, it is obvious that we need to encrypt the large amount of data coming from satellites and going to ground-based receivers. In the 1970s though, such encryption and protection was beyond the capabilities of the small and low-powered satellite computers. Therefore, we needed to solve this problem using innovations that would use the capabilities we had at our disposal.

Scientists in the early satellite industry designed a process of commutation and de-commutation of data; this was an accidental security design. By having each bit of the downstream data represent specific information known only to the receiving equipment, we had a de facto secret required to understand the data.
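The idea can be illustrated with a toy sketch. The field layout below is invented purely for illustration (real telemetry formats were classified and far more complex), but it shows why a downlinked frame is just an opaque bit string to anyone who does not hold the bit-position map:

```python
# Toy illustration of telemetry commutation: fields are packed into
# agreed bit positions, so a frame is meaningless without the map.
# The field names, offsets, and widths here are invented for illustration.

FRAME_MAP = {            # field -> (bit offset, bit width)
    "battery_v": (0, 8),
    "temp_c":    (8, 8),
    "mode":      (16, 4),
}

def commutate(fields: dict) -> int:
    """Pack named fields into a single integer frame."""
    frame = 0
    for name, (offset, width) in FRAME_MAP.items():
        value = fields[name] & ((1 << width) - 1)  # mask to field width
        frame |= value << offset
    return frame

def decommutate(frame: int) -> dict:
    """Recover the fields -- possible only if you hold FRAME_MAP."""
    return {
        name: (frame >> offset) & ((1 << width) - 1)
        for name, (offset, width) in FRAME_MAP.items()
    }

telemetry = {"battery_v": 28, "temp_c": 61, "mode": 3}
frame = commutate(telemetry)
assert decommutate(frame) == telemetry
```

The frame itself carries no field names or delimiters; the shared map plays the role of the secret, which is why the scheme acted as accidental security.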

The Major Transformation

In November 1988, we experienced the first major, though ostensibly unintentional, attack on the ARPANET, the predecessor to the Internet. It was the “Morris Worm,” which exploited known vulnerabilities very similar to those that still plague us today, including weak passwords, lack of filtering, and trusting outside networks without controls.

At the time, I was working for a major defense contractor that was affected by this worm. We formed a rudimentary team to study the attack and plan how to respond to the future attacks we already knew were going to come. Today, we call this a Security Incident Response Plan!

By the end of the eighties, I gathered all the experiences that I gained from my satellite and defense work to launch InfoSec Labs, one of the pioneering security consulting firms that focused on helping major financial, healthcare, and manufacturing companies protect themselves. I thereby entered an environment where my advice needed to be presented and then sold to clients as reasonable and justifiable actions. We all know how difficult it is to convince top management to spend money on intangible rewards and returns. It was challenging but rewarding to provide advice, help implement that advice, and then witness the result.

We built InfoSec Labs from the ground up without external funding because the Venture Capital firms had not yet fully grasped the importance of security or the role it would play in the coming years.

The Holy Grail: Anti-Virus

“I Love You!” Sound familiar? For those who were using email and Microsoft Word in 2000, you probably know the impact of this virus. This was one of the first major and wildly successful attacks in the history of computing, with far-reaching effects that dwarfed the Morris Worm. It was very innovative because it was the first use of embedded macros in a trusted program, perverted to malicious use, and it embodied all the “features” of our modern viruses.

It was at this time that I was approached by some well-established security companies. The reputation of my company and my team attracted their attention, and my firm eventually was acquired by Rainbow Technologies, a major, publicly traded security company.

There were already anti-virus programs and systems available, but this incident helped spur quicker and more widespread implementation of these protections across industries and companies worldwide. The evolution of anti-virus quickened and increased in sophistication. So did the attackers.

Over the coming years, there were many and varied attacks, ever increasing in their sophistication. Even last year, in 2018, we saw new forms of attacks that recognized the improving protections and worked to circumvent them. Some of those used normal-looking software that launched and encrypted systems (“ransomware”). Others used stealth methods that did not use files to attack and take over systems; still others used other advanced techniques.

Today the original anti-virus has transformed into anti-malware and Endpoint Detection and Response (EDR), which includes sophistication unimaginable even a few years ago, with stored data and interactions requiring terabytes of storage. Cloud strategies, along with global regulations and compliance requirements, have made us smarter and caused us to work harder. We all know that compliance does not drive security, but smart security achieves compliance and protects us against the attackers.

Are We There Yet?

In 2017 and 2018, every U.S. voter was compromised. Every Hong Kong voter was compromised. Over the past two years, every U.S. adult has had their credentials and credit information compromised (300 million last year). The European Banking Commission has mandated that all banking compromises in the EU be reported to it immediately. Twenty-five percent of all Australian companies were compromised last year.

The attackers work just as hard as we do, sometimes with significantly greater flexibility and resources. Often, these resources and protections are provided by nation states that provide immunity from capture and prosecution. It is our job to coordinate better with each other, to share information and to jointly find newer and better ways to protect ourselves. Let’s not be bashful in telling our vendors what we want and suggesting collaboration and cooperation among competitors and complementary products.

I do that with some success. Though the vendors don’t always follow the advice, their attention shifts to include that thinking.

The Goal of Availability

There is a creative tension between meeting security requirements and achieving business goals. Security is not just technical security; it means working securely and with recognition of required operational security considerations. Business goals require a significant dedication to customer service, translated to keeping systems and applications up and running nearly all the time. Read more »

This article first appeared in CISO MAG.

CISO MAG: www.cisomag.com

Legislation Passed to Build National 5G Strategy & Protect US Telecommunications Networks

The U.S. House of Representatives has passed a bipartisan bill that would build a national strategy to protect 5G telecommunications systems in the United States and among U.S. allies.

According to a 2018 North Atlantic Treaty Organization report, Huawei’s growing influence as a leading supplier of 5G technology could be exploited by China to engage in espionage, monitor foreign corporations and governments, and support Chinese military operations. In November 2019, the Federal Communications Commission placed greater restrictions on Huawei and fellow Chinese tech firm ZTE due to widespread security concerns. However, the United States still lacks a comprehensive strategy.

The legislation, led by U.S. Representative Abigail Spanberger, is titled the Secure 5G and Beyond Act. It would require the administration to develop an unclassified, national strategy to protect U.S. consumers and assist allies in maximizing the security of their 5G telecommunications systems. The strategy would also identify additional ways to spur research and development by U.S. companies in a way that maintains reliable internet access. Spanberger introduced the bipartisan legislation in May 2019 alongside U.S. Representatives Susan W. Brooks (R-IN-05), Tom O’Halleran (D-AZ-01), Francis Rooney (R-FL-19), Elissa Slotkin (D-MI-08), and Elise Stefanik (R-NY-21).

“The United States needs to be proactive in preventing any vulnerabilities that could be exploited by our adversaries. In our increasingly interconnected world, that means protecting our telecommunications and infrastructure, and those of our allies, from malign foreign interference,” said Rep. Rooney. “Today’s passage of this critical bill, which I was honored to cosponsor, will assist in ensuring the safety, security, and freedom of the United States and in safeguarding our technology infrastructure.”

“I’m proud to help pass this important bill to provide clarity and inter-agency strategy to secure 5th generation and future-generation telecommunications systems and infrastructure across the United States,” said Rep. Stefanik. “Ensuring the United States remains a leading global competitor in both the economy and technology is critical to the future of our nation. This bipartisan legislation requires the President to implement a strategy to secure these systems and maximize their security. I look forward to the bill’s implementation, and to protecting the competitiveness of American companies and the privacy of American consumers.”

The legislation passed in the U.S. House is the companion to a bill introduced in the U.S. Senate by U.S. Senators John Cornyn (R-TX) and Richard Burr (R-NC). Read more »

 

Cybersecurity Weekly: Colorado BEC scam, CyrusOne ransomware, new California privacy law

A town in Colorado loses over $1 million to BEC scammers. Data center provider CyrusOne suffers a ransomware attack. California adopts the strictest privacy law in the United States. All this, and more, in this week’s edition of Cybersecurity Weekly.

1. California adopts strictest privacy law in U.S.

A new privacy rights bill took effect on January 1, 2020 that governs the way businesses collect and store Californian consumer data. The California Consumer Privacy Act mandates strict requirements for companies to notify consumers about how their data will be used and monetized, along with offering them a hassle-free opt-out process.
Read more »

2. Starbucks API key exposed online

Developers at Starbucks recently left an API key exposed that could be used by an attacker to access the company’s internal systems. This issue could allow attackers to execute commands on systems, add/remove users and potentially take over the AWS instance. The security researcher who reported the incident to Starbucks was awarded a $4,000 bounty.
Read more »
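Incidents like this are why many teams now scan their repositories for hardcoded credentials before code ships. A minimal sketch of such a scanner follows; the two detection patterns and the sample string are illustrative inventions, not Starbucks’ actual key formats, and real secret-scanning tools ship far larger rule sets:

```python
import re

# Illustrative patterns for common credential shapes (assumptions, not
# an exhaustive or authoritative rule set).
SECRET_PATTERNS = [
    ("aws_access_key", re.compile(r"\bAKIA[0-9A-Z]{16}\b")),
    ("generic_api_key",
     re.compile(r"(?i)api[_-]?key\s*[:=]\s*['\"][A-Za-z0-9]{20,}['\"]")),
]

def scan_text(text: str) -> list:
    """Return (rule_name, matched_text) pairs for suspected secrets."""
    findings = []
    for name, pattern in SECRET_PATTERNS:
        for match in pattern.finditer(text):
            findings.append((name, match.group(0)))
    return findings

# Hypothetical config line containing a fabricated key.
sample = 'api_key = "Zf3kQ9rLm2Xp8Vt4Nw6Ys1"'
print(scan_text(sample))  # flags the generic_api_key rule
```

Running a check like this in a pre-commit hook or CI pipeline catches the most obvious leaks before they ever reach a public repository.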

3. Cybercriminals filling up on gas pump transaction scams

Gas stations will become liable for card-skimming at their pay-at-the-pump stations starting in October. In the meantime, cybercriminals are targeting these stations with a vengeance, according to security researchers. This is because pay-at-the-pump stations are one of the only PoS systems that don’t yet comply with PCI DSS regulations.
Read more »

4. Travelex currency exchange suspends services after malware attack

On New Year’s Eve, the U.K.-based currency exchange Travelex was forced to shut down its services as a “precautionary measure” in response to a malware attack. The company is manually processing customer requests while the network stays down during the incident response and recovery process.
Read more »

5. Xiaomi cameras connected to Google Nest expose video feeds from others

Google temporarily banned Xiaomi devices from its Nest Hub following a security incident with the Chinese camera manufacturer. Several posts on social media over the past week have showcased users gaining access to other random security cameras. Google warned users to unlink their cameras from their Nest Hub until a patch arrives.
Read more »

6. Colorado town wires over $1 million to BEC scammers

The Colorado town of Erie recently lost more than $1 million to a business email compromise attack after scammers used an electronic payment information form on the town’s own website. The scammers requested a change to the payment information on the building contract for a nearby bridge construction project.
Read more »

7. Maze ransomware operators sued for publishing victim’s stolen data

The anonymous hackers behind the Maze ransomware are being sued for illegally accessing a victim’s network, stealing data, encrypting computers and publishing the stolen data after a ransom was not paid. Lawyers suggest the lawsuit may be a way to reserve the victim’s claim to monetary damages if funds are recovered by the government.
Read more »

8. Landry’s restaurant chain suffers payment card theft via PoS malware

A malware attack struck point-of-sale systems at Landry’s restaurant chain, allowing cybercriminals to steal customers’ credit card information. Due to the end-to-end encryption technology used by the company, attackers were only able to steal payment data “in rare circumstances.” Read more »

 

 

Meet Jean O’Neill: Cloud Expert of the Month – January 2020

Cloud Girls is honored to have amazingly accomplished, professional women in tech as our members. We take every opportunity to showcase their expertise and accomplishments – promotions, speaking engagements, publications and more. Now, we are excited to shine a spotlight on one of our members each month.

January Cloud Expert of the Month is Jean O’Neill
Jean O’Neill, Vice President of Channel at Cyxtera, has over 20 years of technology industry experience with deep expertise in Channel. She was instrumental in building the Channel Programs at Rackspace, Terremark/Verizon, and Involta. Additionally, she has counseled, mentored and supplied Channel consulting to companies in their initial stages, laying proper foundations to ensure successful Channel/Alliances programs. O’Neill is an early and active member of Cloud Girls, a group of advocates dedicated to educating themselves and their stakeholders about the vast and dynamic cloud ecosystem. She is a Magna Cum Laude graduate of Kennesaw State University.

What do you value about being a Cloud Girl?
The sorority and support of women who I hold in such high esteem.

What advice would you give to your younger self at the start of your career?
You may not have chosen the easy path, but you chose the right path.

What is the best professional/business book you’ve read and why?
“Now, Discover Your Strengths” by Donald O. Clifton & Marcus Buckingham, because it greatly contributed to the realization that my time and efforts are best used sharpening my strengths while identifying others whose strengths can augment mine. Read more »

 

The Most Important Agile Trends to Follow in 2020

The new year promises big changes in Agile methodologies and applications. Here’s a look at what to expect.

The Agile concept has come a long way over the past several years, taking its place as a firmly established set of software development methodologies. Yet Agile continues to evolve and mature, as experience leads to improvements, refinements, and new uses.

Looking forward to 2020, Scott Ambler, vice president and chief scientist of disciplined Agile at the Project Management Institute (PMI), a global nonprofit professional organization for project management, expects Agile to grow even more rapidly. “Agile isn’t just a trend; it’s here to stay, especially as we better learn how to effectively yield its benefits,” he said.

The new year also promises an increased focus on agility at scale, as well as better ways of integrating business delivery into engineering teams, predicted Casey Gordon, director of Agile engineering at Liberty Mutual Insurance. “Where, before, application development teams were handed work from upstream business-centric efforts, there’s a shift to move that work closer to the development teams to help reduce lead time,” he explained. “This may mean organizational alignment and structure changes, while portfolio management and executive teams that were previously planning through longer cycles will see those cycles reduced to quarterly or even monthly,” he added.

Steve Myers, managing director of Accenture’s Industry X.0 emerging, connected and smart technologies practice, noted that enterprises are also prepared to externalize more development capabilities. “Doing so will overturn the traditional approach, where all physical development was carried out in-house,” he stated. “After all, wherever else specialist expertise is required, such as in law or accounting, businesses turn to external experts for their toughest challenges, so why not for development of innovative products?”

SAFe at home

Gina Casamassima, vice president of the federal health division of management consulting firm Apprio, sees Agile moving toward more formal structures, such as the Scaled Agile Framework (SAFe) 5.0. The changeover won’t be easy, however. “Corporations and government entities know they need to transform, but there are challenges in incorporating old managerial styles and structures into SAFe,” she warned. “SAFe 5.0’s focus on business agility requires that business and technology leaders are in sync with development and IT operations.” Casamassima added that business and management leaders should also utilize lean and Agile practices to ensure that the enterprise remains competitive and innovative.

Beth Phalen, president of Dell Technologies’ Dell EMC Data Protection unit, believes there will be greater recognition of Scrum@Scale in 2020. “Business leaders will more aggressively seek out how they need to change as they, too, embrace the Scrum simplicity and power,” she said. As for coders, they will need to keep their product at shippable quality at all times, even at scale. Coders will also increasingly find themselves collaborating and validating well beyond the confines of their Scrum teams. “They will need to work closely with business leaders on the right methods and investments to shift over to Scrum@Scale,” she explained.

The upcoming year should also see growing adoption of larger-scale Agile initiatives. “It’s been a slow burn, but the idea of extending Agile from individual Scrum teams to large-scale programs is really gathering steam,” observed Oliver Merkle, delivery director at Agile software developer Nexient. “Organizations that have adopted small-scale Agile have seen the benefits, and they’re comfortable with the basic principles,” he said. The next big leap, he noted, will be to apply those principles at scale.

Agile for all?

Recent years have seen Agile practices flow out of IT and into various business departments, leading to Agile Engineering, Agile Human Resources and Agile Marketing organizations, among others. This expansion is likely to continue at an even faster pace in 2020. “We will see the original values and principles of Agile software development applied to new departments, like Agile Customer Service,” said Scott Abate, Agile project manager at digital business solutions provider Anexinet. “These new adopters will undergo transformations and reap the benefits of agility and adaptability in their own dynamic business environments,” he predicted.

Enterprise Agile is reshaping the way enterprises do business, Merkle observed. “Agile is the best way anyone has found to create products that serve customers and drive business value,” he said. “The principles of constant feedback and rapid iteration are just as valid at the enterprise scale as they are for smaller organizations.” Read more »

 

Removing the Human From the Machine Can Doom Cyber Resilience

There is universal acceptance of the need to be cyber threat resilient—anticipating, preparing for and responding to events and adapting these efforts to continuously changing threat profiles. Yet the security-minded organizational culture needed to achieve resilience remains elusive. One challenge is that the human elements of commitment, collaboration and education are often overlooked. If your cyber risk management efforts remove key human elements from the “machine,” you might accomplish compliance but not resilience.

Kurt Lewin, the father of modern social psychology, put it best: “If you want truly to understand something, try to change it!” Below are three key “resilience killers” from lessons learned over years of working to change organizational mindsets to establish resilience. These are behaviors you should strive to avoid when maturing your cybersecurity capabilities.

  1. Lack of commitment. Many organizations address resilience as a stand-alone goal, compartmentalizing cyber resilience as a network management priority and moving it down the list past revenue and profitability, growth and acquisition, cost control and talent strategy. Leadership needs to recognize that cyber resilience is an underlying element that supports all business priorities. Technology solutions need to connect to the people, processes and protocols that drive business. The impacts of a cyber event are not siloed in one area of the company. Direct costs (forensics, legal fees, compensation for personal data compromise, theft of financial assets), operational costs (systems and service delivery disruptions) and cost of decreased customer confidence all result in lost time, productivity, revenue and possibly executive jobs across lines of business.
  2. Static risk management. Intending to manage risk proactively is of little use if your organization cannot let go of “our way” or “the way it’s always been done.” Being dynamic requires agility – the willingness to change quickly and efficiently to meet emerging threats and think differently about your risk environment and security profile. Companies become static when they define strategies based solely on subjectively measured risks coming from independent operating units and fail to incorporate how the executive team looks at overall risk. Executive risk assessment of core functions should be paired with traditional business impact analysis at the process level, putting the greatest focus on the areas deemed the highest risk by senior leaders. This top-down approach creates an opportunity for IT to educate the business on how the application of technology addresses risk and enlightens IT leaders on when to tighten/loosen specific recovery objectives to satisfy business requirements. Read more »

 

Multi-Cloud Security Is the New #1 IT Challenge for Businesses

Most businesses now have an IT infrastructure that makes use of multiple cloud services providers. A new study from Business Performance Innovation (BPI) Network finds that multi-cloud security has become the biggest immediate IT challenge for businesses, as the authorization and authentication handoffs between these different services provide ample opportunity for things to go wrong.

Mapping multi-cloud architecture with BPI

The mass movement of businesses to a multi-cloud provider model can be traced to a number of factors: a desire not to be locked in to one vendor’s products; a lack of necessary tools from a single vendor (or that vendor not offering those tools at a competitive price point); and network improvements such as lower latency and downtime.

There is, however, a widespread errant belief that somehow a multi-cloud setup is inherently more secure. This can be true, but only if sensitive data is exclusively stored on and accessed from a private part of the cloud that is properly monitored and managed by IT staff. What tends to happen in reality is that these disparate cloud components end up being difficult to integrate and train company personnel on. This leads to all sorts of mishaps, from misconfigured storage buckets being breached to vendors being given access to a much higher level of sensitive data than is required.
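A common first line of defense against the misconfigurations described above is automated policy review: flag any storage bucket whose access policy grants rights to every principal. A minimal sketch follows; the policy format mirrors AWS-style JSON bucket policies, but the checker itself and the example bucket are hypothetical:

```python
import json

def find_public_statements(policy_json: str) -> list:
    """Return policy statements that allow access to any principal ('*')."""
    policy = json.loads(policy_json)
    flagged = []
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        # A wildcard principal, bare or in AWS form, means "everyone".
        is_public = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and is_public:
            flagged.append(stmt)
    return flagged

# Hypothetical bucket policy that accidentally grants world-readable access.
policy = '''{
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow", "Principal": "*", "Action": "s3:GetObject",
     "Resource": "arn:aws:s3:::example-bucket/*"}
  ]
}'''
print(len(find_public_statements(policy)))  # one overly permissive statement
```

Run regularly against every bucket in every cloud account, a check like this surfaces the world-readable configurations before an outside scanner does.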

These are some of the themes seen in BPI’s “Mapping the Multi-Cloud Enterprise,” a survey of the multi-cloud security practices of 127 business and IT decision-makers at a mix of international companies of varying sizes. The survey revealed that 8 out of 10 businesses have implemented a multi-cloud infrastructure, and just over half of these have moved more than half of their applications to the cloud. Over the next two years, 84% expect to increase their use of public or private clouds, and only 2% plan to decrease their use. 52% are planning to incorporate additional cloud services in the near future, with only 13% ruling out the possibility.

Though these businesses seem to almost universally be shifting to a multi-cloud approach, only 11% rated their transition as “highly successful.” Multi-cloud security is the #1 issue cited. These companies reported difficulty in juggling all of these cloud services, finding and training personnel capable of securely managing them, troubles with automation and performance, visibility issues and issues with scaling among their central problems.

63% of the companies named multi-cloud security as one of their top challenges. Specific security needs were led by centralized authentication (62%), centralized security policies (46%), web application firewalls (40%) and DDoS protection (33%).

Only 9% of the companies surveyed reported being “extremely satisfied” with the current state of their multi-cloud security. The vast majority of respondents (82%) were either currently re-assessing their security and cloud services suppliers or are at least considering such an evaluation. The majority (51%) reported either only being “somewhat successful” with their cloud implementation or entirely unsuccessful.

Two-cloud (36%) and three-cloud (17%) setups are the most common multi-cloud configurations. About 10% of respondents have adopted four or more cloud services as part of their digital transformation.

Multi-cloud security: Incompatible with complexity?

Complexity and security are two concepts that are always inherently tough to reconcile. As Dave Murray, BPI research chief, put it: “IT and business leaders are struggling with how to reassert the same levels of management, security, visibility and control that existed in past IT models.”

The most common multi-cloud security issue is the potential for misconfigurations, along with the temptation to simply weaken authorization processes to make sure everything moves smoothly from one app to another. The more disparate cloud components are added, the more visibility becomes a problem. This is often how unsecured data buckets come to be found and breached online.

Another overlooked (but potentially serious) issue that impacts enterprises worldwide is the increased burden of regulatory and legal compliance expenses, particularly in regions such as the EU that require extremely detailed data tracking and reporting. Even if a company deploying multi-cloud is not strictly required by law to have a data protection officer, it may find a need to create such a position (or even a team) simply to track compliance issues and data requests as the cloud architecture expands.

The increased possibilities of “shadow IT” and unauthorized access also need to be accounted for in any multi-cloud security plan. Frustrated by an inability to get different services working together, staff may simply create insecure workarounds. Vendor compromise and third-party data breaches have also been in the news recently just as much as unsecured Amazon S3 buckets; the cause is often simply handing third-party partners too much access to circumvent complex or non-functional authentication procedures. Read more »

 

Malware spotlight: Badware

Introduction: What is badware?

Malware, as the name indicates, is malicious software designed to cause damage to computer systems and networks. Badware is often used as a synonym of malware, but in reality, there are some subtle differences between the two terms.

While malware is an umbrella term that covers a variety of malicious codes including viruses, Trojan horses, ransomware and backdoors, badware is not necessarily software created to destroy systems. In fact, it is often simply used to collect users’ information for a variety of purposes.

In some cases, “users may treat badware infection as an annoyance to be dealt with rather than a threat to their (or their company’s) data and computing resources,” says StopBadware, Inc., an anti-malware organization created in 2006. This nonprofit makes an effort to cleanse websites that are tagged as spreading badware by maintaining a catalog of sites that have been reported to distribute badware and continues to warn consumers about “this kind of attack [that] takes advantage of a vulnerability or ‘hole’ in your web browser, a browser plug-in, or other software on your computer.”

Badware, of course, can also be used by cybercriminals to hack or socially engineer a target, and eventually to use the information gathered to deliver other types of malware.

What problems can badware bring?

Badware can be bad news for both webmasters and users, because it is software that bypasses the intended use of a website or connection to serve its own ends. For users, this means a number of issues.

In the best-case scenario, badware is intrusive and designed mainly to track a user’s moves online to feed information to advertisers and marketing groups. The user will unknowingly release information on his or her browsing or shopping habits through research software or toolbars designed for that purpose, or will be stuck with a secondary, unwanted program installed alongside a program of choice.

In the worst-case scenario, malware/badware will lead to compromise of sensitive data (like passwords or financial info), serve as a means towards attacking other computers or trick users into buying items and services. A typical purchase scam is the banner that pops up, warning the user that the computer is running slow and needs to be defragged. This prompts the user to download a specific, often infected, piece of software.

Webmasters can be equally affected when badware turns their legitimate website into a repository of malicious software. This is obviously a blow to the reputation of the site and can result in a significant loss of visitors and clients.

Is badware a growing problem?

Specific data solely on badware is not currently available, but it’s worth noting that this threat was already getting attention a decade ago: StopBadware.org examined it in detail in its May 2008 Badware Websites Report.

Types of badware

The three most common types of badware behavior are:

  • Malicious scripts: Used to redirect website visitors to a different site or to load actual badware from another source
  • .htaccess redirects: The .htaccess file is a hidden configuration file on Apache web servers; attackers who compromise it can insert rules that redirect visitors to badware websites
  • Hidden iframes: A section of a web page that loads malicious content from another page or site, without the visitor’s knowledge
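To make the .htaccess technique above concrete, here is a hypothetical injected fragment (the domain and the exact rules are invented for illustration; real injected rules vary widely):

```apache
# Hypothetical injected .htaccess fragment (illustrative only).
# It redirects visitors who arrive from a search engine to a badware host,
# while direct visits see the normal site -- which makes the compromise
# harder for the webmaster to notice.
RewriteEngine On
RewriteCond %{HTTP_REFERER} (google|bing|yahoo) [NC]
RewriteRule ^(.*)$ http://badware-host.example/landing.php [R=302,L]
```

Because the redirect only fires for search-engine referrers, a webmaster who types the URL directly may never see anything wrong, which is exactly why checking search-engine behavior (covered below) matters.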

Cybercriminals can also infect computers with badware using drive-by downloads, which is a common method of spreading malware that occurs when a website automatically (and often silently) installs malicious code (usually an exploit kit) onto the victim’s PC — without the user being aware. No clicking is necessary with this kind of attack, which can take advantage of a vulnerability in a web browser, a browser plug-in or other software on a computer to infiltrate the system and take control of it.

How to prevent badware

First of all, it is important to keep a watchful eye and try to identify badware. For example:

  • Your antivirus software or browser displays a warning when you visit the site, such as “Reported attack site” or “This site may harm your computer”
  • The site redirects to an unknown domain when you navigate to it in your browser
  • You notice that permissions or files have been altered, or new users have been added
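Some of these checks can be automated. The following is a minimal Python sketch (the function names and size/style heuristics are my own illustrative choices, not from any tool mentioned above) that scans a page’s HTML for the hidden-iframe indicator described earlier:

```python
# Minimal sketch of a hidden-iframe detector using only the standard library.
# Heuristics (zero/one-pixel size, display:none, visibility:hidden) are
# illustrative; real scanners use many more signals.
from html.parser import HTMLParser


class HiddenIframeDetector(HTMLParser):
    """Collects <iframe> tags that are sized or styled to be invisible."""

    def __init__(self):
        super().__init__()
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        if tag != "iframe":
            return
        a = dict(attrs)
        style = (a.get("style") or "").replace(" ", "").lower()
        tiny = a.get("width") in ("0", "1") or a.get("height") in ("0", "1")
        hidden = "display:none" in style or "visibility:hidden" in style
        if tiny or hidden:
            self.suspicious.append(a.get("src", "<no src>"))


def find_hidden_iframes(html: str) -> list:
    """Return the src attributes of iframes that appear deliberately hidden."""
    detector = HiddenIframeDetector()
    detector.feed(html)
    return detector.suspicious


# Example: a page with an invisible iframe pointing at a third-party host
page = ('<html><body><p>Welcome</p>'
        '<iframe src="http://example.net/x" width="0" height="0"></iframe>'
        '</body></html>')
print(find_hidden_iframes(page))  # ['http://example.net/x']
```

A normally sized iframe produces no warning, so the check can run against every page of a site without drowning the webmaster in false positives.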

Webmasters, in particular, need to be aware and check if any search engines redirect users heading to their sites to different URLs or if the same happens while navigating within the site.

Badware can be difficult to avoid, as it can be slipped into a system via vulnerabilities or by exploiting users’ behaviors. There are a number of things, however, that can help you counteract this threat:

  • Keep website software updated with the latest security fixes. This can patch loopholes that would otherwise let badware onto the computer, where a hacker can steal passwords and/or modify the contents that a user has uploaded

 

Cloud Solutions: Four Key Areas of Focus

When it comes to cloud solutions, there are many questions about the migration process. To help with the transition, end users need a full understanding of what the cloud is and what they would be getting. The security industry is conservative and can be slow to make changes; however, it’s not a question of ‘if’ you might transfer to the cloud, but ‘when.’ Moving to the cloud starts with an understanding of operation and functionality. Education and research come next, homing in on what you are looking for and what a cloud solution can offer to meet your physical security requirements.

So what are a few key elements to focus on when starting a conversation about the cloud? In an effort to provide guidance, these four cloud facts should help you decide whether a cloud solution is right for you.

1. Cloud Provides Cost-Saving Solutions

Eliminating local system components not only makes a cloud deployment less complicated, it also makes system maintenance more cost-effective overall. Transitioning to the cloud will save you money in both the short and long term. A subscription-based model will save on setup and management fees, as well as human resources. Because a cloud solution eliminates the need for NVRs, video management systems, access control servers and panels, you will be eliminating four major expenses.

Additionally, cloud provides a more scalable solution, which means you will be able to add a few cameras or doors to an existing system without having to purchase additional hardware. This allows for a more flexible approach to your security needs – meaning you only pay for what you use. You will no longer have to pay for an NVR that supports 60 cameras while only having 30 installed.

2. Cybersecurity Risks Are Alleviated

You might think because you’re connecting a device to the cloud, to something outside of your network, that it poses more security risks than on-premise storage. However, cloud providers have been working for many years to develop technology to encrypt the data from the moment it’s captured through the transportation to the cloud. This gives you end-to-end security, which is safer than an onsite system. In addition, onsite systems can be prone to damage through theft, server issues and hacking – which are not issues with a cloud system.

When choosing a cloud provider, look for organizations with experience and resources that invest significantly in cybersecurity annually. Consider asking how much a provider will spend on cybersecurity and data privacy every year, as well as if they have a designated cyber team and how big that team is. Your cloud provider will likely be able to invest significantly more into cybersecurity development than you might be able to on your own.

3. Cloud Requires Minimal Bandwidth

The cloud platform you select should run on a single, open architecture to deliver faster, more secure and more reliable services. What you might not know about cloud-based systems is that they can operate without cloud-based video storage. A robust and open platform will allow video to be stored on the camera, or gateway device – enabling the system to run using little to no bandwidth. If you choose to store video on a camera or gateway device, you can transport it over the network on-demand.

You should also look for a solution offering a hybrid mode – a combination of on-device and cloud storage. Hybrid mode is customizable to your needs and allows for video to be stored on the cloud, or locally on a bandwidth schedule. At the end of the day, cloud storage offers a number of options, all requiring little bandwidth.
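The bandwidth claim above can be sanity-checked with a back-of-envelope sketch. The camera count and bitrate below are assumptions for illustration, not figures from the article:

```python
# Back-of-envelope comparison (assumed figures, purely illustrative) of the
# sustained upstream bandwidth needed for continuous cloud recording versus
# a hybrid mode that stores video on the camera and uploads only the clips
# that are actually requested.

CAMERA_BITRATE_MBPS = 2.0   # assumed per-camera stream bitrate
NUM_CAMERAS = 30            # assumed system size


def continuous_cloud_mbps(cameras=NUM_CAMERAS, bitrate=CAMERA_BITRATE_MBPS):
    """All cameras stream to the cloud around the clock."""
    return cameras * bitrate


def hybrid_mbps(clip_minutes_per_day, cameras=NUM_CAMERAS,
                bitrate=CAMERA_BITRATE_MBPS):
    """Video stays on-device; only requested clips are uploaded,
    averaged over a 24-hour day."""
    fraction_uploaded = clip_minutes_per_day / (24 * 60)
    return cameras * bitrate * fraction_uploaded


print(continuous_cloud_mbps())  # 60.0 Mbps sustained upstream
print(hybrid_mbps(10))          # roughly 0.4 Mbps on average
```

Under these assumptions, on-device storage with on-demand retrieval cuts average upstream bandwidth by two orders of magnitude, which is the intuition behind the “little to no bandwidth” claim.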

4. Cloud Requires Fewer Components

Simply put, cloud solutions are easier and less complicated than onsite systems.