
Considering Compliance in the Cloud

Gates Marshall is Director of Cyber Services at CompliancePoint. He has many years of experience in information security consulting with expertise across secure architectural design, vulnerability and penetration testing, OWASP, forensics, incident response, GDPR, FISMA, MARS-E, and cryptographic control design and implementation.

OPAQ: What exactly do we mean these days by “cloud compliance” versus other security and compliance topics?

GM: In some respects, there is not a big difference between on-premise environments and the cloud. HIPAA and PCI standards don't make special exceptions for the cloud; the rules apply the same everywhere. There are also some cloud-specific compliance solutions out there, like CloudeAssurance or CSA STAR Certification, which allow organizations to achieve a quantifiable compliance rating. Yet for a lot of things, being compliant in the cloud is not much different from having a data center somewhere or using a colocation provider.

A significant problem is that when people sign on with a cloud service provider (CSP), they sometimes think they are outsourcing the due diligence aspect of compliance. Google, Microsoft and Amazon have a number of certifications, but these are to certify their own services. They are not certifying that their merchants and other customers are compliant in any specific client-level implementation.

OPAQ: There are some differences, though, right?

GM: The way you can configure systems in the cloud is different from a traditional on-premise installation. For instance, take PCI DSS, which is a fairly prescriptive standard for merchants. It calls for a demilitarized zone (DMZ) that is separate from your LAN, to isolate and protect credit card data behind a firewall. CSPs may support other mechanisms, like AWS security groups, that provide similar functionality; however, doing so still doesn't meet all of the compliance requirements for a DMZ. So organizations are using these new cloud services, but they are missing some of the requirements as they relate to architecture controls and/or logical segmentation.
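For illustration only (and not as a PCI DSS control set), here is a minimal sketch of the kind of segmentation a security group can express, assuming Python with boto3 and placeholder VPC and CIDR values. As noted above, a security group alone does not satisfy the standard's DMZ requirements.

```python
import boto3

ec2 = boto3.client("ec2")

# Web-facing tier, kept apart from the cardholder data environment.
dmz_sg = ec2.create_security_group(
    GroupName="dmz-web-tier",
    Description="Public-facing web tier isolated from cardholder data",
    VpcId="vpc-0123456789abcdef0",          # placeholder VPC
)["GroupId"]

# Allow only HTTPS in from the internet.
ec2.authorize_security_group_ingress(
    GroupId=dmz_sg,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# Allow outbound only to the internal app tier (in practice you would also
# revoke the security group's default allow-all egress rule).
ec2.authorize_security_group_egress(
    GroupId=dmz_sg,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 8443, "ToPort": 8443,
        "IpRanges": [{"CidrIp": "10.0.2.0/24"}],  # placeholder app-tier CIDR
    }],
)
```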

OPAQ: How would you describe the level of security and compliance support at the major cloud providers?

GM: They do quite a bit to reduce the burden of compliance. Most of them produce good documentation to declare what we call a service provider controls responsibility matrix.  It shows what the provider is doing around compliance and that helps because it both reduces the burden on the customer and declares where the customer’s remaining responsibilities begin. Security at the large CSPs has improved a lot, for instance with services like Amazon CloudWatch for monitoring. All the major providers now have good auditing capabilities for the management interface and offer multifactor authentication. These developments give customers more confidence in the cloud.

OPAQ: Is security protection in the cloud as good as or better than an enterprise on-premise environment?

GM: We tend to have an affinity for legacy configurations in the on-premise world. By that I mean we set something up, it works, and we never change it. It's security through obscurity. When you go through the transformation process to become a cloud-first organization, you need to fix all those legacy issues that were acceptable in the LAN environment. You can't be so sloppy. Cloud providers may be less secure than on-premise, however, because you're letting someone else manage the Layer 1 infrastructure. The physical addressing, networking and storage configurations now fall on the CSP. They may have weaknesses you don't know about, and the customer has to depend on third-party attestations. Hypervisor hopping has been a concern for a while: if a CSP's hypervisor technology has a flaw, a malicious actor could jump between different customers' VM guests through the hypervisor. There aren't any disclosed examples of this happening, but it's always a risk in a multi-tenant environment.

OPAQ: Yet most if not all of the massive breaches in recent years have been in on-premise environments, right?

GM: While this is true, many of these breaches could have taken place in the cloud. Equifax had a real problem with inventory because they didn't have visibility into the software that should have been patched. That scenario could have also occurred with a CSP. Vulnerability management is critical in any implementation. Accenture did have an issue in the cloud recently, which could have been disastrous. In October, it was discovered that the global consulting firm had left an AWS S3 storage location unsecured, leaving over 100GB of customer data accessible, without authentication, to anyone on the Internet with the correct S3 URL. The same kind of insecure configuration could also occur with on-premise technologies. No matter where your data sits, IT needs to secure the location against exploitable configurations and software flaws.
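As a hedged illustration of the kind of check that would catch that class of exposure, here is a short audit sketch assuming Python with boto3 and credentials that can read bucket ACLs; a real review would also cover bucket policies and account-level settings.

```python
import boto3

# Grantee URIs that mean "everyone" or "any authenticated AWS user".
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    public = [
        grant["Permission"]
        for grant in acl["Grants"]
        if grant["Grantee"].get("URI") in PUBLIC_GRANTEES
    ]
    if public:
        print(f"WARNING: {name} grants {', '.join(public)} to the public")
```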

OPAQ: Do you foresee more regulation in the area of cloud compliance and security?

GM: Yes. The EU’s General Data Protection Regulation (GDPR) has huge potential to change a lot of things in tech. It goes into enforcement in 2018, and may become a global standard for privacy. GDPR applies to any organization that uses the data of people who are in the EU at the time of data collection. Two key principles of GDPR are that companies and organizations should use data minimization to keep the smallest amount of data possible and use consent mechanisms to ensure they’re authorized to hold or use that data. If you have 10 million customer records, but determine that you only need to keep two million records and purge the rest, your risks go down. If a breach occurs, there is less data loss and lower costs to mitigate the impacts of the loss. Information privacy is the next frontier. The large CSPs realize that if they don’t get in front of this, they will lose business. This will require that CSPs look closely at the leading cyber risk rating mechanisms, and adopt one or two of them. I think we’ll also see more CSPs provide guidance on how to meet global data security and privacy requirements in an effort to help customers help themselves.
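To make the data minimization point concrete, here is a minimal sketch of a scheduled retention job; the schema, column names, and two-year window are hypothetical, not anything prescribed by GDPR.

```python
import sqlite3

RETENTION_YEARS = 2

# In-memory database stands in for the real customer data store.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer_records (id TEXT, consent_given INTEGER, last_activity TEXT)"
)
conn.executemany(
    "INSERT INTO customer_records VALUES (?, ?, ?)",
    [("c1", 0, "2014-03-02"), ("c2", 1, "2015-06-11"), ("c3", 0, "2017-09-30")],
)

# Purge records with no recorded consent that are past the retention window.
with conn:
    deleted = conn.execute(
        """
        DELETE FROM customer_records
        WHERE consent_given = 0
          AND last_activity < date('now', ?)
        """,
        (f"-{RETENTION_YEARS} years",),
    ).rowcount

print(f"Purged {deleted} records outside the {RETENTION_YEARS}-year retention window")
```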

IoT Systems are Complex, and so is Securing Them

Brian Russell is Chief Engineer, Cyber Security Solutions at Leidos.  In this role, he defines and implements cyber security controls for Internet of Things (IoT) and cloud products and systems. Russell is the co-author of “Practical Internet of Things Security” and is Chair of the Cloud Security Alliance (CSA) IoT Working Group.

OPAQ: How do security risks for IoT devices and applications differ from mobile security or web app security?

BR: Some of the risks related to IoT devices are similar to risks we're already familiar with, such as those identified by the Open Web Application Security Project (OWASP): security misconfigurations, sensitive data exposure, using components with known vulnerabilities, and privacy risks. Where we run into differences compared to mobile and web app security is in the physical nature of IoT devices, the acquisition and deployment models for those devices, the automation they enable, and the privacy issues associated with them.

For example, we might see IoT products deployed across a city, such as smart parking meters or road-side units (RSUs). These devices need comprehensive physical protections built into them to prevent theft and extraction of firmware for further security analysis. It's also important that access controls for these devices are explored thoroughly. We've already seen plenty of scenarios where product makers have used shared credentials across a family of devices; these configurations make it unnecessarily easy for malicious actors.

The IoT is also similar in some instances to BYOD, in that employees or customers may bring connected products, such as smart watches, into the organization. Or employees might install smart TVs on corporate networks, and those devices could send data out to the manufacturer. Security teams need to be on the lookout for these connected devices and make sure they don't open avenues to export company data to the outside.

As for new acquisition models, a company may decide to lease an expensive connected asset instead of purchasing it. Often, the asset is remotely managed by the vendor. This opens new interfaces into the organization's networks that must be locked down.

OPAQ: What are the top enterprise risks from IoT?

BR: First, it's useful to understand the core ways that enterprises are using IoT data. We are seeing that manifest in two ways. First, IoT devices feed data into analytics systems that companies rely upon for decision making. Second, IoT systems can enable automated decision making within control systems, for example sensors that collect system status data used to decide whether to continue or stop a running process.

From an analytics perspective, we must protect against data tampering. If we do not have confidence in the provenance of the data, then decisions made based on that data must come into question. So we must apply lifecycle security protections to the data to enforce data integrity; this can be accomplished through cryptographic hashing algorithms, for example. Organizations that collect sensitive data from individuals must not only protect it, such as with encryption, but must also recognize that they are collecting sensitive data in the first place. If, for example, you're collecting blood pressure data from your patients, that piece of data alone isn't necessarily sensitive. But when combined with identifying information, the aggregate data is subject to regulatory compliance rules.
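As a simplified example of that kind of lifecycle integrity protection, the sketch below signs each sensor reading with a keyed hash (HMAC-SHA256) and verifies it before the data is trusted; the field names and key handling are illustrative only.

```python
import hmac
import hashlib
import json

SHARED_KEY = b"replace-with-a-per-device-secret"   # hypothetical; a real key lives in secure storage

def sign_reading(reading: dict) -> dict:
    """Device side: attach an HMAC computed over the canonical reading."""
    payload = json.dumps(reading, sort_keys=True).encode()
    reading["mac"] = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return reading

def verify_reading(reading: dict) -> bool:
    """Analytics side: recompute the HMAC and compare in constant time."""
    received_mac = reading.pop("mac", "")
    payload = json.dumps(reading, sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(received_mac, expected)

reading = sign_reading({"sensor": "pressure-7", "value": 101.3, "ts": 1508400000})
print("integrity verified:", verify_reading(reading))
```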

If a malicious actor gains access to an IoT-enabled industrial control system, then they can cause unexpected physical actions to occur, which put the safety of the enterprise’s stakeholders at risk.  For example, by increasing the pressure in an oil pipeline, attackers could cause an explosion.  That’s why I usually like to recommend performing at least a rudimentary safety analysis for any IoT system being implemented.

OPAQ: Is security a barrier right now for the adoption of/broader potential of IoT?

BR: What is a bit concerning is that I don't necessarily know that security is a barrier right now for the adoption of IoT solutions. IoT-based innovation continues at a rapid pace, even in safety-critical industries. Connected and autonomous cars are already on the road, medical devices are being connected, control systems are being connected, and the home/consumer IoT market continues to expand. It seems that many of us are willing to take a chance on new technologies enabled by the IoT and then update those devices when a security flaw is discovered.

OPAQ: What kind of advice would you give IT departments regarding implementing IoT security plans, whether that's employees bringing in personal IoT devices and apps, or the company putting business IoT technology in place?

BR: First, sit down and think about what policies you might need to institute, such as what devices people can bring into a space and what they can connect to the network. Also, keep track of IoT-related vulnerabilities and make sure to tune your detection processes based on what might be in use in your organization. For organizations putting business IoT technology in place, make sure that you aren't infringing on anyone's privacy with these systems (e.g., conduct a Privacy Impact Assessment) and make sure that you aren't jeopardizing the safety of users, either. Build a threat model to identify the high-value assets and the data flows within your system, and lock them down appropriately. Apply integrity controls to your data at all points within your systems. Keep track of all of the IoT assets in your enterprise, which includes tracking the physical locations of your assets and the versions of firmware/software running on them. And, of course, put a plan in place to keep all of your IoT assets updated.
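As a small illustration of that last point, here is a hypothetical asset record and a check that surfaces stale or out-of-date devices; the fields, versions, and thresholds are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class IoTAsset:
    asset_id: str
    device_type: str
    location: str
    firmware_version: str
    last_seen: datetime

inventory = [
    IoTAsset("rsu-0042", "roadside-unit", "5th & Main", "2.1.4", datetime(2017, 10, 1)),
    IoTAsset("tv-lobby", "smart-tv", "HQ lobby", "1.0.9", datetime(2017, 9, 2)),
]

LATEST = {"roadside-unit": "2.1.4", "smart-tv": "1.2.0"}   # hypothetical target versions
NOW = datetime(2017, 10, 16)                               # fixed reference time for a deterministic example

# Surface devices that have not checked in recently or run outdated firmware.
for asset in inventory:
    if NOW - asset.last_seen > timedelta(days=30):
        print(f"{asset.asset_id}: stale, last seen {asset.last_seen:%Y-%m-%d}")
    if LATEST.get(asset.device_type) != asset.firmware_version:
        print(f"{asset.asset_id}: firmware {asset.firmware_version} is behind")
```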

What You Need to Do – Major Wi-Fi Encryption Vulnerability

The vulnerability:

A set of significant vulnerabilities has been disclosed in the encryption used by Wi-Fi networks (specifically the WPA2 protocol). An attacker within radio range of a Wi-Fi network can exploit these vulnerabilities to completely decrypt traffic as well as manipulate or inject data. These vulnerabilities impact nearly every vendor of Wi-Fi client software. The impact on Linux and Android devices is particularly severe.

How to mitigate:

The best way to mitigate these vulnerabilities is to install patches. The vulnerabilities impact multiple vendors, so CERT/CC is hosting a webpage with links to security advisory and patch information for each affected vendor. This page will be updated over time as new patches are released: http://www.kb.cert.org/vuls/id/228519

Deploying a second layer of encryption can be a useful mitigation while patches are unavailable. The simplest way to achieve this is to require users to run their corporate VPN clients whenever they are connected to Wi-Fi. An ACL or firewall rule can then be used to block traffic from the Wi-Fi network to every destination other than the VPN gateway.
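As one hedged illustration, on a Linux gateway such a rule set might look like the following sketch, which permits forwarding from the Wi-Fi interface only to the VPN gateway and DNS and drops everything else; the interface name and addresses are placeholders for your environment, and the script must run as root.

```python
import subprocess

WIFI_IF = "wlan0"               # placeholder Wi-Fi interface
VPN_GATEWAY = "203.0.113.10"    # placeholder VPN concentrator address

rules = [
    # Allow Wi-Fi clients to reach the VPN concentrator...
    ["iptables", "-A", "FORWARD", "-i", WIFI_IF, "-d", VPN_GATEWAY, "-j", "ACCEPT"],
    # ...and DNS so the VPN client can resolve its gateway name...
    ["iptables", "-A", "FORWARD", "-i", WIFI_IF, "-p", "udp", "--dport", "53", "-j", "ACCEPT"],
    # ...and drop all other forwarded traffic from the Wi-Fi network.
    ["iptables", "-A", "FORWARD", "-i", WIFI_IF, "-j", "DROP"],
]

for rule in rules:
    subprocess.run(rule, check=True)
```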

Switching your Wi-Fi network from WPA2 to WEP encryption is not advised, as WEP has far more significant security problems.

Learn more:

A detailed description of the vulnerabilities and the research surrounding them is available at this link: https://www.krackattacks.com

Briefly, the vulnerability impacts the WPA2 protocol. Part of the four-way handshake can be replayed to a client, causing the client to reinstall an encryption key that is already in use and reset the nonces associated with it. This key reinstallation leads to keystream reuse, which makes effective cryptanalysis and decryption possible. In the case of Linux and Android 6.0+ devices, the encryption key can even be reset to an all-zero key, with catastrophic consequences.
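To see why that matters, the toy example below (not KRACK itself, and not WPA2's actual cipher) shows how reusing a keystream lets an attacker cancel it out by XORing two ciphertexts, so a single known plaintext exposes the other message.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

keystream = os.urandom(32)                 # a keystream that a fresh nonce should make unique

msg1 = b"user=alice&card=4111111111111111"
msg2 = b"user=bob&session=deadbeefcafe123"

ct1 = xor(msg1, keystream)                 # two messages encrypted with the
ct2 = xor(msg2, keystream)                 # SAME keystream (nonce reuse)

# The attacker never sees the keystream, but XORing the ciphertexts removes it:
ct_xor = xor(ct1, ct2)                     # equals msg1 XOR msg2

# With one known (or guessable) plaintext, the other falls out immediately.
recovered = xor(ct_xor, msg1)
print(recovered == msg2[:len(msg1)])       # True
```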

Meyer: Closing the Cybersecurity Skills Gap with Entry-Level Roles

Ean Meyer is a Course Director with Full Sail University, teaching the next generation of engineers about information security. He has experience in PCI, SOX, intrusion detection and prevention systems, information security program management, penetration testing, and social engineering/user awareness training. Ean has a B.S. in Information Security and an A.S. in Computer Network Systems.

OPAQ: What are a few reasons why security skills are lacking in the workforce?

EM: There are two main problems in the higher-level discussion about the skills gap. We have focused too much on passing tests and not on critical thinking, history, and engineering. Information security is about thinking outside of the box: you have to think like a hacker. The second challenge is that academia has a tendency to be behind the curve. In some colleges you have to pass electrical engineering to get into the network security course. That's a major barrier for people who could be excellent network engineers or security analysts. It doesn't make sense. I am a big believer that the skills gap can be solved by a trade-school and real-world education approach. People aren't going to enter the workforce into environments where it's all brand-new technology, except for maybe at startups. In large organizations you're going to have a lot of legacy technology, so teaching the history of that and learning how to deal with those challenges is part of the skills gap issue.

OPAQ: What skills are most needed now?

EM: The top one is security analyst. These are people who can come in, understand the environment quickly, and provide value once they're taught well-defined processes and when to escalate. There are lots of people from IT fields who know how computer infrastructure works and can be taught additional pieces of process they haven't been exposed to yet. The second big one is cloud security architect. The cloud is not simply push a button and it's all good behind the scenes; for AWS, there are 1,500 pages of security documentation. I'm also a big fan of understanding what is going on in social engineering, the con men just trying to trick people. I think security awareness training is a big opportunity. These trainers can help employees understand in plain language the real issues and how to protect themselves.

OPAQ: You recently wrote about a solution to the skills gap involving the creation of entry-level security roles at companies. Tell us how this can work.

EM: One of the arguments is that you are not a security person unless you are a generalist at the peak of your career. But someone familiar with Microsoft tools could become a security champion. Let's create roles where someone could evaluate a new vulnerability because they know all of the company's IT systems. There could be new types of intern programs where someone is in charge of real projects like patch management, allowing them to learn and grow and stay on with the company. Interns are often brought on with no real goal; they aren't learning or doing much and you aren't getting much value from them. That intern could have a senior engineer overseeing their work, and then you can grow the security workforce. You'll also learn a lot because the person from the outside will see things you won't see.

OPAQ: What kind of culture and processes are needed to support the in-house training and development of entry-level roles?

EM: The security analyst doesn’t need to program in C++. You can get a great analyst who can see the alerts on a dashboard and address them. They can learn how to code later, if needed. It’s not necessary to create an HR firewall requiring all these certifications and degrees to get a job in security. Job rotations are another idea. Someone who’s been on the database team for a few years could get invited to work on the security team for a few hours a week. That builds relationships and allows people to move more easily into a security role when there’s a need. I would also encourage directors to worry less about having to replace that database person and consider how that person is bringing institutional knowledge to a security role and can still be a resource to answer questions for the database team. We need to focus more on these cross-departmental relationships.

Acohido: Cyber-insurance is still nascent, yet worth a look

Pulitzer-winning journalist Byron V. Acohido is the founder and executive editor of Last Watchdog, a pioneering security webzine. One of the nation’s most respected cybersecurity and privacy experts, Acohido conceived and delivered a nationally-recognized body of work for USA Today, chronicling the frenetic evolution of cybercrime in its formative stages.

OPAQ: Some 32 percent of U.S. businesses purchased some form of cyber liability and/or data breach coverage in the last six months, compared to 29 percent in October 2016, says a survey by the Council of Insurance Agents and Brokers (CIAB). Do you think this growth will continue—and why?

BA: Demand for cyber insurance absolutely will increase at a healthy clip for the foreseeable future. That’s because the value of business data and intellectual property today far outstrips the value of the physical plant. Think about it: we can do astounding things with cloud computing and mobile devices. And yet the business networks that support Internet-centric commerce remain chock full of security holes. Criminals get this, and will continue to take full advantage. Meanwhile, businesses are scrambling to figure out how to deal with data theft, network disruptions and cyber fraud. And we are in the very earliest stages of dialing in insurance to help them offset these emerging exposures.

OPAQ: There are a number of barriers for purchasers of cyber insurance, including: lack of standardization on policies and pricing, difficulties determining risk, difficulty showing attribution when a breach or incident occurs, and so on. Thoughts on these and how should the insurance industry address them?

BA: There’s nothing, really, stopping the industry from taking the first step of standardizing the basic terminology to use in cyber policies. Right now there is none. Standardized language would pave the way for underwriters to begin more assertively partnering with cybersecurity vendors to come up with innovations to measure cyber risks. Insurers could become much more proactive about incentivizing companies to embrace more rigorous security policies and practices. As the pool of lower-risk policyholders grows, the industry could then begin to extend policies to cover specific cyber exposures that today are not routinely covered.

OPAQ: There is risk in buying cyber insurance in terms of mitigating losses. For instance, Target received an estimated $100 million in coverage, which didn’t even cover half of the $290 million it lost. How can companies avoid this sort of outcome?

BA: No company should be relying solely on insurance to eliminate all, or even most, cyber exposures. In the current environment, where hackers probe business networks 24 by 7 by 365, network security should be a top priority for all organizations. It’s a cliché, but true, that there is no silver bullet. The use of layered security technologies remains vital; no less so continually refining and enforcing policies and training employees. A cyber policy can then be thoughtfully purchased to offset the remaining risk.

OPAQ: Given these barriers, any tips for CSOs seeking carrier quotes?

BA: It’s an interesting time to go shopping for cyber coverage. Even though the insurance industry has left many things undone, there is wide recognition of the pent-up demand. The result is that there are many companies competing aggressively to sell policies. In a sense, it’s a buyers’ market. Numerous options are available to get some level of cyber coverage from somebody. The problem, of course, is that the devil is in the fine print. So it is important to find a knowledgeable, trustworthy agent to guide you through the due diligence process.

OPAQ: Finally, what could security vendors be doing to help their customers with cyber insurance – a.k.a. data collection, navigating insurance decisions, partnering, etc.?

BA: The path forward for security vendors, at this point, seems to be much the same as insurance buyers – become knowledgeable about this emerging market and align yourself with smart, trustworthy partners. A few pioneering partnerships between insurance companies and security vendors are out there, and I expect this trend to accelerate over the next few years.

Consistency and Cost Savings from Cloud-Based Security

Bob Brandt is an information security expert, most recently as the Global Security Architect at 3M. While at 3M, he focused on integration efforts for 3M application services across cloud and mobile platforms. Bob also devoted significant effort to improving 3M’s malware protection capabilities. He was on the governing body for several Twin Cities CISO Summits and co-chaired the Twin Cities chapter of the Identity Management Meetup for several years. Follow Bob on Twitter: @bobbrandt.

OPAQ: Which cyber security threats seem to be foiling enterprises today, and the vendors that serve them?

BB: The human factor is still a weak point. There are improvements that could be made to phishing defenses, as that is one of the main channels through which these attacks are successful. Phishers only need a low hit rate to be successful. However, a cloud service can deliver a consistent way of looking at data from all the various usage patterns. For example, every app has a Web version and an app for mobile, and those are distinctly different deployment patterns. All the traffic, whether it comes from a WiFi or wired or mobile network goes through the same cloud service on its way to the application, and this enables companies to provide consistent security. It’s also more cost-effective to secure your applications through a cloud service instead of using several different technologies.

Another key area where companies are falling down is privacy, governance, and risk around data. If you had controls on the data, it wouldn't matter if someone stole the whole database, because they couldn't crack open the encrypted data.

OPAQ: If you could start a company in the security industry today, what would be the focus?

BB: I'd probably work on a service that applied and enforced controls on data, such as authorizing people to access data and tracking that access. For instance, in a hospital environment, the software would track who looked at patient data and when, because there should be very few people doing that, and even those who are authorized should have a reason for accessing your personal data. If the fields are encrypted at the data layer, it would be hard for hackers to use them. Axiomatics and BigID are two of the companies working on this today.
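A minimal sketch of that idea, using the third-party Python cryptography package (pip install cryptography) for field-level encryption; key management and access auditing, which are the hard parts, are deliberately left out.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice the key lives in a KMS/HSM, not the application
f = Fernet(key)

# Sensitive fields are stored only as ciphertext, so a stolen table is unreadable without the key.
record = {
    "patient_id": "P-1042",
    "blood_pressure": f.encrypt(b"128/82"),
}

# Only an authorized, audited code path ever decrypts the field.
print(f.decrypt(record["blood_pressure"]).decode())
```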

OPAQ: Are there big differences in how midsize to large enterprises should approach security compared with smaller companies? Especially since smaller companies can still have large databases of sensitive information of value to hackers?

BB: First off, I'll say that the cloud is a great equalizer. I think everyone should use cloud services for security. Large enterprises might have a few experts on staff to keep vendors honest and to customize the solution if needed. Smaller companies might rely more on a managed service provider, as they don't want to pay for IT staff, but on their own they can't keep up with changing security needs and threats. The differences are mainly in how to staff for security; the functionality is about the same regardless of company size, and most of it should run in the cloud. Another advantage of the cloud is that if you are running applications in an outside service, your business benefits from the traffic data of thousands of companies. An event like a single packet doesn't mean much, but across all those companies it does. The cloud providers can see patterns in the data, which can result in early detection of threats.

OPAQ: Security skills are at a premium. How do you think companies should best handle this challenge moving forward?

BB: People still tend to talk mainly about firewalls and hackers, but that problem will be solved. In the future the skills will be less about malware analysis and more about application security and integration, digital signatures, and connecting clouds securely. If we just built security into transaction APIs, the noise of malware would go down substantially. Increasingly, security is becoming automated; firewall administration, for example, can come as part of a vendor's solution.

There are threat analytics services which are largely automated and look for patterns in big data sets. These services can tell a customer when an attack might be coming—the kind of analysis that a customer would never be able to see just by looking at its own data.

From Russia to WannaCry, Bad Actors are Hard to Nab

David Strom is editor of the email newsletter Inside Security. He also consults for vendors on emerging technologies, products, strategies, and trends. Strom, formerly the editor-in-chief of Network Computing, has authored two books on the topic.

OPAQ: What are hackers looking for lately when it comes to attacks on business and is there a focus on particular verticals?

DS: Yes, any vertical where there is money. It's all about whaling attacks and CEO phishing attacks. Any business that is successful is a target, which is scary. Malware is getting a lot sneakier, too. There are all sorts of ways to hide attacks by using registry exploits, PowerShell, and other things that make use of the internals of Windows infrastructure to elude detection. But even when malware authors aren't using these techniques, their attacks are still sitting on corporate networks for months. Too many people still have their heads in the sand. You may be a $1 million or $2 million corporation and think that your business is too small to target, but everyone is a target now. You really need to have the best defenses possible.

OPAQ: We’ve all heard enough about Russia and the elections, yet not quite enough about why these attacks happened, and what government or political organizations can do to ensure they never happen again?

DS: Russia began with Estonia, then moved on to the country of Georgia, and later hit German targets and, of course, the United States. Even Estonia is pretty sophisticated when it comes to digital policies and protections. The problem is that people are not doing a great job of examining what data is leaving their networks. It used to be that everyone was focused on what was coming into their networks, but the real issue is what is leaving. I can grab a database and move it offsite very quickly into a Dropbox account and no one's the wiser. People aren't scrutinizing the right side of the equation. You need some kind of intrusion detection system that works in both directions, looks at what is entering and leaving networks, and can distinguish between ordinary and abnormal activity.
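As a toy illustration of watching what leaves the network, the sketch below flags a host whose outbound volume jumps far above its own baseline; the flow data and threshold are invented for the example.

```python
from statistics import mean, stdev

# Outbound bytes per hour, oldest to newest, keyed by internal host (hypothetical flow data).
history = {
    "10.0.1.15": [120_000, 95_000, 110_000, 105_000],
    "10.0.1.22": [80_000, 75_000, 90_000, 6_500_000_000],   # 6.5 GB suddenly left
}

for host, samples in history.items():
    baseline, latest = samples[:-1], samples[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    # Crude z-score check against the host's own historical egress volume.
    if sigma > 0 and (latest - mu) / sigma > 5:
        print(f"{host}: unusual outbound volume ({latest:,} bytes vs ~{int(mu):,})")
```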

OPAQ: In recent news related to the WannaCry ransomware outbreak, Marcus Hutchins was arrested and charged with creating and distributing the Kronos banking malware. We rarely hear about the bad actors being discovered and arrested. Any thoughts on why that is?

DS: First on Marcus, it’s not even clear that he is a bad guy. It’s not like an accident on the freeway where you get hit and someone sees the accident in plain sight. A lot of this stuff is not readily observable. We need tremendous cooperation between private and government researchers to track these people down. Organizations can put lures called honeypots on their network to bring in the bad actors. Yet, that might not even be legal in some cases. A private business may not have the right to prosecute because the digital fingerprint isn’t always clear. Or if the individual is from another country, they might not be able to do anything about it. Attribution is very difficult: it’s a hall of mirrors. I could try to break into GM and when they come after me, I could say no, they are hacking me! The legal system is way behind on these matters. Even with a lot of technical knowledge, I think it’s going to be really hard to prosecute Mr. Hutchins.

OPAQ: Which new advancements in enterprise security technology are interesting to you and why?

DS: New password and authentication technologies are very exciting. Passwords are still the biggest weakness in companies. We can make this much more automated with the latest single sign-on and password management products. We also need better defense mechanisms, especially on phones and tablets. A lot of people use their phones on enterprise networks. But let's say my kid downloads an app on my phone that's infected with malware. The next day I go to work and log in to the network from my phone. Very quickly that malware can sniff out passwords across the network. Google has done a terrible job of handling malicious apps in the Play Store, but it just came out with Google Play Protect, which automatically screens devices in the background for malware. The third area is ransomware-as-a-service. This will get stronger because that's where the money is. I can have no skill whatsoever and put together a ransomware campaign with a few mouse clicks and make a lot of money. Corporations have to do a better job of making regular data backups and inspecting their network traffic to combat ransomware attacks.

OPAQ: Any thoughts on the security-as-a-service market and how it will grow in the coming years?

DS: Putting security in the cloud is definitely the wave of the future. We will see many more MSPs doing consolidation in this area to broaden their offerings. Smaller companies want to avail themselves of these services because they can’t afford to have that expertise on staff, yet they’re still going to get attacked. We are seeing threat-sharing databases get more popular. Cloud vendors can still have a proprietary take on security, but don’t need to create their own databases. These two parties will have symbiotic relationships. Over time cloud security services will be more attractive to larger companies. They are moving more of their data into the cloud so it makes sense to put security there too.

ISE® Northeast Forum Recognizes OPAQ Customer for Innovative Security Project

We’re excited to announce that one of our customers has been selected as a nominee for the 2017 ISE® Northeast Project Award. The nomination is based on the security achievements of our customer Sandy Alexander, a midsize marketing communications company that provides an array of services including CG studio services, digital printing, direct mailing, data driven marketing solutions, and retail visual merchandising.

Here’s the background on Sandy Alexander’s security project with OPAQ:

The company, which devotes 20% of its IT budget to security, had been using a managed security service provider (MSSP) for branch office security management. While using an MSSP was a sensible approach to supplement Sandy Alexander's small security staff, the benefits were not adding up. Justin Fredericks, the company's IT director, says he was frustrated with the MSSP's service quality and response time. He started looking for a new solution that would connect and secure its branch offices and vendors in a way that was less costly and complex, and ultimately more secure.

Sandy Alexander’s internal IT operations have dramatically improved using OPAQ’s centralized, automated security-as-a-service solution.  “My team no longer has to think about what policies are up on one site versus the other, or which IP addresses or VPN tunnels are where,” Fredericks says. “We have complete visibility and the ability to control these policies and rules across the entire environment on one dashboard.” He predicts that the company will save money, over time, compared with the MSSP.

Some details and benefits of the project include:

  • OPAQ's solution was layered on top of the company's IT infrastructure, including several branch offices and vendor sites such as data centers and manufacturing providers. The integration is deployed over redundant VPN connections. This was accomplished in one day!
  • The OPAQ 360 platform gives the IT department a central portal/dashboard to streamline policy enforcement, view status and alerts, manage threats and monitor all activity across its network.
  • The company is now obtaining complete security coverage over its IT infrastructure from one source: firewalls, intrusion and malware prevention, logging, reporting, analytics, and Distributed Denial of Service (DDoS) protection.
  • A distributed, branch office-based approach to security has now been replaced with a centralized system; that means less complexity for IT and better visibility and control over the network and all users.

 

The Information Security Project of the Year Award Program Series has been running for more than 10 years now, and winners will be announced at the ISE® Northeast Forum and Awards on October 11, 2017 in New York City.

 

David Monahan: The Perimeter is an Amoeba

David Monahan is Research Director at Enterprise Management Associates (EMA). He has organized and managed both physical and information security programs, including security and network operations (SOCs and NOCs) for organizations ranging from Fortune 100 companies to local government and small public and private companies. Prior to joining EMA, David spent almost 10 years at AT&T Solutions focusing on the network security discipline. Follow David on Twitter: @SecurityMonahan.

OPAQ: Do most mid to large organizations know how many devices they have on their network—and what issues and remedies does this present for security?

DM: From our research, there is a wide gap in visibility. We estimate that organizations lack visibility into anywhere from 10% to as much as 25% of their systems. There are many causes: untracked development systems, BYOD devices, rogue systems being set up under “shadow IT,” random IoT devices, and other physical and virtual machines going up all the time.

To gain control of this environment, we are always looking at new niche technologies that can help identify systems and devices. Organizations are evaluating network access control (NAC) and device-visibility solutions from established vendors like ForeScout, Cisco, and HPE, as well as newcomers like Pwnie Express and Zingbox.

The good news is there are a lot of things you can do about this problem, but companies need to go about it programmatically rather than in a knee-jerk fashion. That way you don’t end up with a bunch of disparate technologies that don’t work together. We are in a challenging time because there is no hard and fast perimeter anymore. The new elastic perimeter is like an amoeba that changes based on what is happening in that moment.

OPAQ: What are other pressing needs in network security today and are there good solutions?

DM: Knowing your assets, whether in your own data center or in the cloud, is key. Identifying what and where they are, who has access to them, and when they are being accessed is really table stakes. Active breach detection is extremely important because the bad guys are still getting in. Endpoint defense is a crucial area because that is where a lot of attacks are focused; a good example of that is ransomware. But ransomware can't encrypt files that it doesn't have access to, so companies need to do a better job of controlling access to their networks and systems as well as detecting these insider threats faster. Security analytics is another area that is really useful, delivering large value to customers. These forms of analytics are used to detect an entity (user, application, system) that begins acting differently than it has in the past, or differently from how other similarly classified entities are acting, either historically or presently. These changes in behavior can come from insiders that have turned against the company or from external threat actors that have gained access. In this case I like to make a distinction between insider threats and threats on the inside. The former is when an employee is misusing their access to do bad things. The latter is when someone from the outside has acquired credentials and does bad things, but we think we can still trust them because they are masquerading as the trusted insider. A lot of attacks today come from people leveraging credentials that they shouldn't have.
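A toy illustration of that behavioral-baseline idea, comparing an account's current activity both to its own history and to its peers; the data and thresholds are made up for the example.

```python
from statistics import mean, stdev

# After-hours logins per week: [historical weeks..., current week] (hypothetical data).
accounts = {
    "dbadmin-1": [1, 0, 2, 1, 14],
    "dbadmin-2": [2, 1, 1, 2, 2],
    "dbadmin-3": [0, 1, 0, 1, 1],
}

for account, samples in accounts.items():
    history, current = samples[:-1], samples[-1]
    peers = [s[-1] for a, s in accounts.items() if a != account]

    own_mu, own_sigma = mean(history), stdev(history)
    peer_mu, peer_sigma = mean(peers), stdev(peers)

    # Flag only when the account deviates from its own past AND from its peer group.
    off_own_baseline = own_sigma > 0 and (current - own_mu) / own_sigma > 3
    off_peer_baseline = peer_sigma > 0 and (current - peer_mu) / peer_sigma > 2
    if off_own_baseline and off_peer_baseline:
        print(f"{account}: {current} after-hours logins vs personal norm ~{own_mu:.1f}")
```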

OPAQ: What do mid-market executives say is a top barrier for delivering strong enterprise security today?

DM: The two main issues are skills and tools shortages. Budgets have not really been the main issue for the last few years; in every survey that I have done, the results show that on average security budgets are increasing by 12 to 16% yearly. Aside from shortages, there are also issues with what I refer to as political, tools, and data silos restricting an organization's ability to gain full visibility and context for events. Tools silos arise from different groups in the business buying tools without really working together. Data silos are created by gaps in the data, which might come from misconfigured or poorly configured tools not gathering or retaining the data properly. One example would be only retaining server logs for 10 days, so I cannot research a newly discovered breach that actually began before that. Or perhaps my logging levels are set too low, so the data I need wasn't captured, or set too high, in which case I can't find the crucial data through all of the data “noise.” Political silos are created by individuals who own data, tool, or human resources and do not openly share those resources, in order to maintain control for some reason or another. There are so many issues troubling enterprises today. Often companies don't fully understand the scope of what they are doing and its impact on security.

OPAQ: What is the biggest security technology game-changer today—or in development now?

DM: There are a number of tools that can make a big difference. The next-generation endpoint security vendors are fighting the battle of defending endpoints where antivirus technologies are not doing a good job. Another one is active breach detection systems, which gather data off networks. These two technologies are ideally situated to augment each other. A really interesting new space is called deception technology. There are only a handful of companies focusing on this right now. The technology places trigger artifacts on endpoints and on the network that would not be encountered in the normal course of business, so when they are touched, security is alerted to respond. (Other capabilities vary by vendor.) Then there is the space of security analytics. This is categorized into predictive analytics, anomaly detection, and user and entity behavior analytics (UEBA). These tools have evolved out of the need to provide real-time analytics for identifying incidents and events. Another growing area is micro-segmentation, which is the policy-based control of cloud resources. It is designed to control how virtual, cloud, and hybrid IT systems interconnect to create secure workloads and workflows. As container and cloud adoption advances, traditional firewalling does not work, so micro-segmentation across all of these deployment strategies will be imperative.

Michael Suby: Using Automation and Assessments to Fight New Threats

Michael Suby is VP of Research at Stratecast, a division of Frost & Sullivan. Suby oversees the business operations of Stratecast and its research direction and serves as an analyst in secure networking. Suby spent 15 years in the communications industry with AT&T and Qwest Communications in a range of managerial, financial, and operational roles.

OPAQ: What are the latest advancements in network security technologies for the enterprise?

MS: The new technologies focus on greater detection with greater speed and certainty, and the ability to respond with greater speed and precision. We are also seeing detection-less preventive mechanisms, which incorporate signals and pattern matching to block malware. This is important because with zero-day and highly customized malware, there is an increasing chance that we will get hit by malware which hasn't been seen before, so detection technologies are not as effective. Finally, we are seeing more interest in isolation techniques. If we can isolate the endpoint, or use virtualized containers on the device inside a web session, we can prevent the propagation of malware to other devices.

OPAQ: Have advancements in security automation been helpful to the security industry and if so how?

MS: Security vendors are improving automation in their products so users can manage security systems more efficiently. Incident detection and response tools and event management tools are some of the systems that commonly embrace automation today. Automation is important in security because of increasing security technology sprawl: there are more apparatuses to manage and a greater IT footprint in terms of devices, hardware, networks, and the Internet of Things. Meanwhile, the number of security professionals needed far outstrips the supply, and the cost of that talent keeps going up. So automation helps orchestrate work across the various technologies and can reduce the amount of mundane activity. This enables network security people to take their talent and apply it to the highest priorities first. Automation is also important because it helps speed up the time to detect and respond to events.

OPAQ: What should heads of security consider when making plans for their budgets in the coming 12 months?

MS: With all of the recent publicity around ransomware, we know that the current technologies won’t always work as needed. Many companies have a lot of gaps and vulnerabilities which are not being addressed and which create opportunities for ransomware writers to exploit. Yet budget planning is not just about buying new technology but increasing the frequency of objective security assessments of the enterprise. It’s often better to engage a third party to do this: they have no allegiances plus they have the knowledge base from companies in other industries. I would advise that companies spend more budget on understanding their risk position, and then taking steps forward.

OPAQ: How does this planning change if a company is going to significantly increase its cloud investments?

MS: Growing your cloud presence is common but it doesn’t happen in a vacuum. If you are doing this, at the same time you’re also decreasing the company’s presence in private data centers and managed hosting arrangements. What organizations really need to do is look at the tools and skills for managing hybrid IT. The goal is to manage workloads and control risk across all of the IT environments, and without elevating hours spent by finite IT and security resources. Of course, the cloud is not just about saving money but being more responsive to market needs. But if you are not keeping an eye on the operational aspects, it’s likely that the cost efficiencies you’re hoping to gain will be offset by the need to bring more humans into the equation.