
Drawing a New Map of Enterprise Networking

Earlier this year I got to hear Tim O’Reilly speak at Grand Central Tech as part of their Authors @ GCT lecture series. Mr. O’Reilly is out promoting his new book, “WTF? What’s the Future and Why It’s Up To Us.” One of the themes of his book is the process of innovation – how we go about creating technologies that completely change the way that we think, work, and live.

O’Reilly writes about drawing visual maps of the different elements within a company’s business plan in order to understand how they interrelate – a process he learned about from a strategic consulting firm called BEAM. He then proceeds to draw such a map for an on-demand transportation company like Uber or Lyft.

There was a particular way that on-demand transportation worked a decade ago – you called a cab company, a dispatcher announced your location on a radio network, and hopefully one of the cab drivers agreed to pick you up. Over time a set of technologies became available – the Internet, smartphones, and dispatching algorithms – that enabled a completely different way of organizing this process. However, the new map for on-demand transportation didn’t draw itself – it was the job of innovators to realize that an opportunity existed to connect each of these ingredients in a new way, and to persuade the public that this new way is, in fact, a better way.

Of course, this got me thinking about what we’re doing at OPAQ Networks. IT organizations have been building enterprise networks in the same way ever since we started connecting businesses to the Internet in the early 1990s. I usually credit Steven Bellovin and William Cheswick for drawing the original maps of this territory in their book “Firewalls and Internet Security.” This model is often called the “perimeter security model” – “We’ve got a bunch of sensitive computer systems here in our corporate headquarters, so we connected all of our satellite offices into that headquarters and we’ve built a stack of security solutions there to protect everything.”

Over time that model has started to show signs of strain. The sensitive systems that used to collect at headquarters are gone – they’ve moved into the cloud. However, the security stack is still there, and traffic of all kinds is still being backhauled through headquarters for the sole purpose of sending it through the stack. Despite this approach, attackers are successfully getting inside by infecting end user workstations. Once their malware is running on the other side of the firewall, they have free rein over the internal network and can get right to the data they want to steal.

At OPAQ Networks we are building a new map for this territory. First, we’re moving the security stack into the cloud, where the sensitive assets now live. This solves the backhaul problem, because satellite offices and remote VPN users can connect to cloud assets through our network instead of backhauling through a corporate headquarters. OPAQ has a nationwide network of points of presence and more than 200 peering relationships with major service providers that enable us to get traffic to its destination as efficiently and reliably as possible. Most small and medium-sized enterprises don’t have the means to build this kind of infrastructure for themselves.

Second, we’re introducing software-defined network segmentation, a completely new technology that provides enterprises with unparalleled visibility and control over their internal networks. Using this tool, it’s possible to granularly segment internal networks so that end users only have access to the resources that they need, without having to reconfigure VLANs or wrestle with NAC solutions. Our partners’ midsize customers are able to adopt a better security posture, so that a single endpoint compromise does not imperil their entire business.

We are entering a time when the traditional way of building enterprise networks is being disrupted, and other maps are being drawn. Google’s BeyondCorp is one such map, along with the idea of Zero Trust Networks that was eloquently detailed in a recent O’Reilly publication. These approaches suggest doing away with the VPN and the security stack entirely, placing internal applications directly on the Internet and connecting users to them through authenticating proxy servers.

While I believe the BeyondCorp approach has merit, and there is a great deal that we can learn from it, it’s also very difficult for small and medium-sized businesses to adopt. The traditional security stack delivered from the cloud has value, particularly for businesses where consistent patch and configuration management can be a challenge. The VPN has value, because it draws a clear line between the organization’s assets and the outside world. The problem is that these assets are often hosted in the wrong place today, and better segmentation is needed behind them.

This is what we’re doing at OPAQ Networks – we’re drawing a new map for the practice of enterprise networking in the cloud computing era. By leveraging network security-as-a-service, software-defined network segmentation, and a modern, global network infrastructure, we’re enabling our customers to build networks that are more efficient, reliable, and secure than ever before.

Simplified Microsegmentation — From the Cloud

It is time to change the way that organizations approach network segmentation. In the past few years we have seen a mounting collection of threats target the wide-open nature of most organizations’ internal computer networks. Although security pros have been harping on this for some time, most networks remain crunchy on the outside and chewy in the middle – once attackers get past the perimeter, they often have access to anything and everything inside the organization.

We’ve recently seen repeated threats exploit this exposure. We’ve seen incidents where entire organizations are crippled by ransomware spreading internally within their networks. We’ve seen the return of internet worms like WannaCry and NotPetya. We’ve seen more automated attacks that pivot from an initial point of compromise within a Windows network to Domain Admin access. In fact, experts are predicting significant increases in the volume of these attacks because of developments in attack automation.

Almost every organization needs to improve the segmentation of its internal network to cut down on these threats. What is preventing organizations from taking action?

Traditional Network Segmentation is Complex and Difficult to Manage

Unfortunately, the traditional approach to implementing network segmentation poses significant challenges. Configuring and managing internal firewalls and VLANs is both labor intensive and relatively inflexible. Network architecture is usually driven by the need to provide connectivity rather than security. Organizing machines with different security requirements onto separate VLANs is complex, and as soon as the work is done, users demand changes. Deploying multi-factor authentication for internal applications and services can also be a daunting project as each application must be separately integrated.

It’s no wonder organizations — particularly midsize enterprises — continue to struggle with implementing a smart, sustainable network segmentation strategy. What are midsize enterprises — and the service providers supporting them — supposed to do?

Zero Trust Software-Defined Network Segmentation from the Cloud

The term “microsegmentation” has recently become a buzzword in the IT world. Most microsegmentation solutions provide a manageable way to lock down east/west traffic policies for cloud workloads. However, many of the threats we’re seeing – ransomware, worms, and domain lateralization – target end user workstations instead. What organizations need is easy-to-deploy, software-defined microsegmentation that is flexible enough to support the entire enterprise network.

Since the acquisition of Drawbridge Networks in May 2017, we have embarked on integrating unique intellectual property into the OPAQ Cloud that allows users to manage software-defined microsegmentation for the entire enterprise, from a single pane of glass. The OPAQ PathProtect™ capability dramatically simplifies network segmentation, enhances network visibility and control, and enforces policy locally at each device, whether it’s a cloud workload or an employee laptop.

OPAQ PathProtect™ works by connecting software agents running on endpoints with a central controller hosted in the OPAQ Cloud. This architecture provides visibility and control from the cloud into every network interaction happening on every endpoint. This capability gives you the power to investigate incidents, protect against insider and external attacks, and prevent certain devices, such as compromised endpoints, from talking to other workstations on the network.

Microsegmentation with OPAQ PathProtect™ can be used to define granular access segments for users that operate independently from the network’s hardware and physical topology. It also can be easily updated when business needs change. Segments can be defined based on user identity, group membership and job function, and they will follow users as their laptops move throughout the network. OPAQ PathProtect™ can be used to enforce multi-factor authentication for access to any resource or service on the network, without any need to integrate with individual applications. This is possible because the central controller oversees all communication within the network and can authenticate users before allowing traffic to flow.
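
To make the policy model concrete, here is a minimal sketch, in Python, of how a central controller might evaluate identity-based segmentation rules and decide when to demand step-up MFA. It is purely illustrative and not OPAQ’s actual implementation; the class names, groups, and services are hypothetical.

    # Illustrative sketch only -- not OPAQ's actual PathProtect implementation.
    # It models the general idea: a central controller holds identity-based
    # segmentation policies, and an endpoint agent asks it whether a given
    # connection should be allowed (and whether MFA must be satisfied first).

    from dataclasses import dataclass, field

    @dataclass
    class Policy:
        group: str                  # user group / job function the rule applies to
        allowed_services: set       # services this group may reach, e.g. {"crm", "mail"}
        require_mfa: set = field(default_factory=set)  # services that demand step-up auth

    class Controller:
        """Central policy decision point hosted in the cloud."""
        def __init__(self, policies):
            self.policies = {p.group: p for p in policies}

        def decide(self, user_group, service, mfa_verified):
            policy = self.policies.get(user_group)
            if policy is None or service not in policy.allowed_services:
                return "deny"        # default-deny: no rule, no access
            if service in policy.require_mfa and not mfa_verified:
                return "challenge"   # ask the agent to trigger step-up MFA
            return "allow"

    # Example: finance users can reach the ERP system, but only after MFA;
    # nobody outside engineering can reach the build servers.
    controller = Controller([
        Policy("finance", {"erp", "mail"}, require_mfa={"erp"}),
        Policy("engineering", {"build", "mail"}),
    ])

    print(controller.decide("finance", "erp", mfa_verified=False))        # challenge
    print(controller.decide("finance", "build", mfa_verified=True))       # deny
    print(controller.decide("engineering", "build", mfa_verified=False))  # allow

The point of the sketch is the default-deny posture: if no rule grants a user’s group access to a service, the connection is denied, and sensitive services can require a successful MFA challenge before traffic is allowed to flow.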

These capabilities allow organizations to adopt a security posture that is more aligned with Zero Trust security principles, in which users only have access to the specific applications required by their job function. Cutting down on unnecessary access closes the avenues that malware and network attackers use to spread laterally within an organization.

Microsegmentation for Endpoints, Not Just Data Centers

OPAQ PathProtect™ is a microsegmentation solution that can protect the whole network, including workstations, servers, datacenters, and cloud workloads, supporting the following capabilities and use cases:

  • Network Visibility provides detailed topological views of the interactions between hosts on the internal network. It is possible to drill down into different timeframes, hosts, users, process names, ports, and protocols for complete insight into network activity.
  • Network Access Control (NAC) assigns which hosts and users can access which services on the network. For example, unmanaged hosts can be prevented from accessing sensitive servers, and they are identified and cataloged when they send traffic.
  • Multi-Factor Authentication (MFA) integration enables step-up authentication to tighten security for VPN access and within the internal network.
  • Granular Segmentation, which is completely separate from the physical network architecture and network addressing, can be used to segment specific devices, applications, and data, and can keep track of hosts as they move around the network.
  • Quarantine allows organizations to quickly isolate infected hosts from sensitive resources at the touch of a button.

To find out more, view the press announcement, sign up for our upcoming webcast and schedule a demo to see how simple microsegmentation can be from the cloud.

Game-Changer: What OPAQ’s Selection of Palo Alto Networks Really Means

We’re thrilled to have announced our partnership with Palo Alto Networks, which opens up tremendous opportunities for our MSP, MSSP, and VAR partners to deliver enterprise-grade security-as-a-service from the OPAQ Cloud.

This is a huge deal. This agreement furthers OPAQ’s mission to provide fully integrated networking and enterprise-grade security as a simple, cloud-based service. It means that OPAQ partners are empowered with:

  • A subscription model designed to make enterprise-grade security affordable and accessible to midsize enterprises. The traditional approach has put the enterprise-grade security that midsize enterprises need out of reach because it is too costly and complex to manage. The OPAQ Cloud is a game changer here, and it means new, lucrative revenue opportunities for partners.
  • Fortune 100-grade network security that’s known and trusted. The OPAQ Cloud integrates best-of-breed security capabilities powered by known, trusted technologies from Palo Alto Networks and other industry leaders, along with unique OPAQ intellectual property.
  • Cloud network engineered for speed, strength, and flexibility. OPAQ owns and operates its own private network backbone. In addition to integrating best-of-breed security capabilities into the fabric of the platform, OPAQ optimizes the speed and performance of network traffic by leveraging transit and peering relationships with world-class providers.
  • Single interface designed for simplified management, compliance, and reporting. The OPAQ 360 portal provides a single pane of glass where all customer security policies and network traffic can be centrally managed and enforced — all without the cost and complexity associated with managing dozens of security products from multiple vendors.

We chose Palo Alto Networks because they are a proven leader in next-generation security technologies. Bringing Palo Alto Networks into the OPAQ Cloud makes enterprise-grade network security much more accessible for midsize enterprises and more manageable for the solution providers supporting them.

For more information on OPAQ’s partnership with Palo Alto Networks, read the press release here.

OPAQ CTO Tom Cross Writes on Lateralization Attacks in First Article on CSO Online

Lateralization attacks figure in most sophisticated breaches today. An adversary will typically gain a foothold inside the victim’s network by installing malware on a vulnerable device.

From there, the attacker will compromise other computers within the organization by moving laterally through the compromised network. A number of experts are predicting an increase this year in Windows Domain lateralization attacks. Organizations are increasingly looking for a solution that can prevent lateralization attacks and keep them from spreading across their networks.

OPAQ chief technology officer Tom Cross was recently invited to be a regular contributor to CSOonline, one of our industry’s most respected publications. In Tom’s first article, he discusses lateralization attacks against Windows networks, and how to defend against them. You can read the full article here.

OPAQ Shortlisted for Best Emerging Technology in 2018 SC Awards

We received some exciting news last week. The OPAQ Cloud was named an Excellence Award finalist in the Best Emerging Technology category for the 2018 SC Awards, an annual competition that recognizes the top solutions in the cybersecurity industry.

This is the second major accolade our technology has received in the past two weeks. The OPAQ Cloud was recently named best (Platinum) Network Security solution in the 2017 GSN Homeland Security Awards for cybersecurity excellence.

Making this list is very gratifying. The SC Awards are widely regarded as the gold standard in cybersecurity. The winners will be announced at the SC Awards ceremony on April 17 in San Francisco, in conjunction with the RSA Conference, the industry’s largest gathering.

According to Illena Armstrong, VP, Editorial for SC Media, “OPAQ Networks has demonstrated unique innovation in its approach to protecting companies from the onslaught of malicious attacks and other threats. Their solution represents some of the most effective security technology on the market today.”

The OPAQ Cloud is a security-as-a-service platform that integrates a private network backbone with built-in enterprise-grade security capabilities from the world’s leading technology providers and our own intellectual property.

Our vision was to create a solution that makes advanced cybersecurity protection accessible to midsize companies that lack the resources and staff to knit together and manage multiple products themselves. With the OPAQ Cloud there’s no hardware or software to buy, install and manage.

Since many midsize companies lack in-house security expertise, the OPAQ Cloud is available from managed service providers who can remotely monitor and protect their networks.

You can read the SC Awards press release here.

OPAQ Cloud Named Best Network Security Solution by Gov Security News

We are pleased to report that the OPAQ Cloud platform was recently named best (Platinum) Network Security/Enterprise Firewall solution in the 2017 GSN Homeland Security Awards for cybersecurity excellence.

The Awards are hosted by Government Security News (GSN) to recognize excellence and leadership in the Cyber Security and Homeland Security sectors. Winners were selected based on a combination of technological innovation, ability to address a recognized government IT security need, and flexibility to meet both current and future needs. Category winners were ranked with Platinum, Gold and Silver designations.

The OPAQ Cloud is tailored to meet the unique needs of State and Local governments, which face the same sophisticated security threats, like ransomware, as larger federal agencies, but tend to lack the resources and technical experts to adequately protect their networks.

The massive WannaCry cyberattack that infected computers in at least 150 countries several months ago is a good example. In the aftermath, many State IT officials said they often don’t have enough money to effectively fight sophisticated cyber threats. And the scale of that attack made them even more concerned.

Doug Robinson, executive director of the National Association of State Chief Information Officers (NASCIO) went on the record to say: “This is a big wake-up call because it is cyber disruption. States and local government need to address this because it’s a serious threat. We have urged states to take action immediately.”

There are many security products that do valuable things for state and local governments. However, many of these products and their management systems are isolated and do not talk to each other.

This is why automation and orchestration are becoming a game-changing necessity for state and local governments. Leveraging automation can help state and local governments effectively detect and respond to threats at speed. This is what the OPAQ Cloud is designed to do — and it’s why we were honored with the GSN Homeland Security Award.

To find out more about the GSN Homeland Security Award, see the announcement. To learn about the OPAQ Cloud and the benefits of security-as-a-service visit https://www.opaqnetworks.com/solution.

Why We Pivoted to a 100 Percent Channel Sales Model

Today we announced the OPAQ Channel Partner Program and the completion of our transition to an indirect sales model. There are a number of reasons for this change.

First, many midsize enterprises look to service providers to deliver security services. These organizations struggle to protect themselves from cyber threats due to the shortage and high cost of skilled IT professionals, the growing sophistication of attacks, and the complexity of managing multiple security products and services. These challenges have driven a spike in demand among midsize enterprises to outsource their security. According to Gartner, Inc., services will make up over half of all security spending, at $57.7bn in 2018. Meanwhile, spending on security outsourcing services will total $18.5bn, an 11 percent increase from 2017.

Second, both midsize enterprises and service providers struggle with the upfront expense and complexity of acquiring, configuring and maintaining multiple hardware and software security products from different vendors.

For many midsize enterprises, the capital cost of implementing a Fortune 500-grade security infrastructure, not including the human resources to manage it, is overwhelming. Meanwhile, service providers that want to offer managed security services face a similar dilemma, only from a scalability and profit margin standpoint. The traditional hardware/software model requires they purchase products, install them at the customer site(s) and then manage the infrastructure.

Many of our partners’ midsize enterprise customers require complete outsourcing, while others prefer a co-managed or self-managed approach. Our partners know which model best suits each customer. We have invested significant time and resources in the development of our “single pane of glass” approach.

This enables partners to deliver end-to-end network security across their customers’ distributed infrastructures — including data centers, branch offices, mobile and remote workers, and IoT devices. The OPAQ 360 portal, a web-based interface, enables our partners to centrally provision, configure and manage an unlimited number of customer sites and policies remotely. Our Partner Portal also makes it simple for partners to go to a single place in order to access training, sales support, deal registration, and other resources that are essential in helping them to accelerate time-to-value.

According to one of our channel partners, Tom Turkot, vice president of client solutions for Arlington Computer Products, “The OPAQ Cloud is a game changer.”

You can read today’s announcement here: OPAQ Channel Partner Program Press Release. Or for information about the OPAQ Channel Partner Program visit: https://opaqnetworks.com/partner-program.

KPI v. KRI v. KCI: Key Cyber Security Indicators

Companies that have spent significant time and money on managing their cyber security environment understandably want to know the results of all this expenditure. As such, it is important for Managed Security Service Providers (MSSPs) to be able to provide customers with some visibility into those results. However, results only tell half the story. For instance, they may demonstrate that there was a breach, but, without significant forensic effort, they will not necessarily reveal the sequence of events or failures that led up to the compromise.

Organizations are complex and have many performance measures. Most have designated key performance indicators (KPIs) at various levels of the organization, which business management agrees are the most important metrics to monitor. They are designed to be leading indicators of business performance. Key risk indicators (KRIs) are similar in that they are leading indicators; however, rather than signal performance, they signal increased probability of events that would have a negative impact on business performance. Then there are key control indicators (KCIs), which are closely related to KRIs in that they measure the effectiveness of risk controls.

Business managers use KPIs to show where things are going well or poorly and KRIs to indicate when the probability of the latter is increasing. KCIs are a measure of how well risk controls are performing. MSSPs can, and should, do the same using security data that is commonly available for most of their clients.

More on KPIs, KRIs and KCIs

You may hear these terms used interchangeably; however, they are distinctive and should be treated differently in order to make them understandable.

  • Key performance indicator (KPI): Shows how the business is performing based on the goals and objectives leadership has set, as well as the progress being made toward those goals. For security operations, this metric might track progress in resolving open items or working down a backlog of unresolved security investigations.
  • Key risk indicator (KRI): Measures the company’s level of risk, and how its risk profile changes over time. An example for security operations is to use metrics that measure the severity of threats and vulnerabilities being reported by sensors. Another example is to look at where in the defensive chain events are happening (e.g., endpoint-based events are “riskier” than firewall or WAF events). Finally, make sure you have a good understanding of the business role that the assets involved play. Security events that occur on critical assets present more risk than those on noncritical ones.
  • Key control indicator (KCI): Indicates how much control a company has over its environment and its level of risk, or how effectively a particular control is working. Putting this in context with IT security operations, a question to ask is whether you have the necessary controls across all areas of the business – for example, the NIST Cyber Security Framework functional areas (identify, protect, detect, respond and recover). Knowing that these functions have sufficient coverage throughout your defense in depth (devices, applications, networks, data and users) gives you a degree of confidence in your controls.

How to Use These Metrics

The interplay between the performance, risk and control metrics is the key feedback that an organization needs in order to be confident that investments in cyber security are appropriate. Now that we have defined the appropriate use for the individual metrics, let’s see some examples of how to apply them:

  • Risk is the probability of a bad event multiplied by the business cost of that event. You can estimate the probability by looking at the number, place (where in the defense-in-depth model) and severity of events measured by sensors. For the impact, or real cost, look at which hosts are involved. Are they where the crown jewels are kept, or more of an extra store-room full of old furniture? Faced with so much data, organizations can be afflicted with “analysis paralysis,” so simplify these measures into risk metrics everyone can understand (a minimal worked example follows this list).
  • Performance metrics are meant to show how efficient an organization is at accomplishing its mission. In cyber security, the mission happens to be risk mitigation. So performance is how well you manage your backlog of open security cases, time to resolution, etc. with respect to the staff and systems you have. There are significant parallels to customer support metrics in this category.
  • Controls mitigate risks and enable performance. In cyber security, technical (security sensors) and process controls are your bread and butter. They also generate the data that drive risk metrics and allow you to optimize performance. Compliance measures are your friend here. Measure your degree of coverage against a framework such as NIST CSF.
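
As the minimal worked example promised above, here is one way, sketched in Python with hypothetical weights and event data, to roll event counts, severities, defense-in-depth location, and asset criticality up into a single KRI number that can be tracked over time.

    # Illustrative sketch only: one way to roll raw security events up into a
    # simple KRI, weighting each event by its severity, by where it was seen in
    # the defense-in-depth model, and by the criticality of the asset involved.
    # The weights, layer names, and events below are hypothetical.

    LAYER_WEIGHT = {"firewall": 1.0, "waf": 1.0, "network": 1.5, "endpoint": 3.0}
    ASSET_WEIGHT = {"noncritical": 1.0, "critical": 4.0}

    def kri_score(events):
        """Sum of severity * layer weight * asset weight over all events."""
        return sum(
            e["severity"] * LAYER_WEIGHT[e["layer"]] * ASSET_WEIGHT[e["asset"]]
            for e in events
        )

    events = [
        {"severity": 3, "layer": "firewall", "asset": "noncritical"},  # 3 * 1.0 * 1.0 = 3
        {"severity": 5, "layer": "endpoint", "asset": "critical"},     # 5 * 3.0 * 4.0 = 60
    ]

    print(kri_score(events))  # 63 -- endpoint events on critical assets dominate the score

Weights like these are judgment calls that should be agreed with the business, but once they are fixed the metric is consistent, easy to explain, and can be recomputed automatically as new sensor data arrives.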

Generating the metrics here seems like a daunting task at first. But, once you start simplifying and categorizing the measures, you will find that you can come to a reasonable set quickly. Then you need to automate their calculation. With experience, you will learn whether you’ve chosen the right KPIs and KRIs, and you can make adjustments as necessary. Getting started can be a challenge for MSSPs, but it’s 80 percent of the battle.

The most important thing to remember is that the statistics coming out of your cyber security systems are not KPIs, KRIs or KCIs. They are just data. Decide what performance, risk or control measures you need in order to clearly explain the state of security operations to the business you support.

Test these on business managers to make sure they resonate, adjust and go again. The more consistent and transparent your measures, the more confidence your clients will have in their security investments.

Putting KPIs, KRIs and KCIs into Practice

On one hand, you have a large amount of security data – the proverbial big data problem. On the other hand, you need actionable output – a list of what to do now to transform your clients’ security programs into a high performance business driver. Metrics will guide your path to success, but generating consistent and reliable information security metrics is hard. So here are a few steps to get you started.

Step 1: Understand your Coverage, Operations, and Compliance Challenges

Security operations involves a set of functions being performed across a set of assets. The NIST Cyber Security Framework (CSF) provides a core list of the functions and the Cyber Defense Matrix from OWASP does a fine job of aligning those functions against a representative set of assets. Categorizing the deployed security products or processes in your client’s environment within the matrix will establish coverage and identify gaps in the program’s architecture.

Operationalizing the matrix by collecting, identifying and assigning the output data from your security products to each cell in the matrix shows evidence of operations and serves as your first step in addressing the ‘big security data’ problem.  Gaps between what you thought you had deployed and what actually shows up as evidence of operations will provide you with an immediate ‘to-do list’.  

Applying a control framework (such as CIS Top 20, GDPR, or FFIEC) adds depth to each of the intersections by mapping specific security controls to both deployed security products and your client’s assets.  The resultant overlay identifies gaps in your compliance effort and your second ‘to-do list’.  When combined with your operational to-dos, the entire list can be mapped to a 30, 60, 90-day plan of action with key milestones. Wash, rinse and repeat for each of your lines of business or departments, and you now have a path for your journey.
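
As a sketch of what Step 1 can look like in practice, the matrix can be represented as a simple mapping from (function, asset class) cells to the products or processes that cover them; any empty cell is a coverage gap and lands on the first to-do list. The product names below are hypothetical placeholders, not part of any particular deployment.

    # Hypothetical sketch of a Cyber Defense Matrix coverage map:
    # rows are NIST CSF functions, columns are asset classes, and each cell
    # lists the deployed products or processes that provide that function
    # for that asset class. Product names are placeholders.

    FUNCTIONS = ["identify", "protect", "detect", "respond", "recover"]
    ASSETS = ["devices", "applications", "networks", "data", "users"]

    coverage = {
        ("identify", "devices"): ["asset-inventory-tool"],
        ("protect", "devices"): ["endpoint-protection"],
        ("detect", "networks"): ["ids"],
        ("protect", "users"): ["mfa", "awareness-training"],
        # ... remaining cells filled in as products and processes are catalogued
    }

    # Any (function, asset) cell with no entry is a coverage gap -- the first to-do list.
    gaps = [
        (func, asset)
        for func in FUNCTIONS
        for asset in ASSETS
        if not coverage.get((func, asset))
    ]

    for func, asset in gaps:
        print(f"gap: no control providing '{func}' for '{asset}'")

The same structure can then be overlaid with the chosen control framework to produce the second, compliance-focused to-do list described above.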

Step 2: Measure your Efficacy 

With security products and processes deployed and more on the way as you move down your path, it is time to measure the effectiveness of each action and ensure its alignment with the business.  Recall that operationalizing the matrix served as the first step in solving the big data challenge by categorizing the data and applying business context through the assets in the matrix and each line of business or department.

Enriched with this context, the security data can now be normalized and analyzed to produce key metrics, or as we called them earlier, KPIs, KRIs and KCIs. Examples include the rate at which new threats or vulnerabilities appear for KRIs, the treatment of symptoms versus root causes for KPIs, or the reduction of defensive workload for KCIs.

With metrics in place, each to-do on your journey can be seen as a resultant change in one or more metrics. What’s more, the value of fixing operational to-dos or implementing a specific control can be measured and communicated specific to the business context it affects. At each milestone on the journey, thresholds for metrics can be set to determine success or identify needed adjustments in the plan.

Final Thought

It’s all about the journey. A successful information security program is not an end-state, but a continually monitored and adjusted compilation of people, process and technologies. Mapping the program’s functions with your client’s assets and required controls provides you the steps needed to mature your program while metrics will keep you honest about how well the program is performing.

Closing the security skills gap with online education

Ryan Corey is President and Co-founder of Cybrary, Inc., an online security training and education provider. Cybrary provides free access to security courses, along with learning tools and an enterprise training product.

OPAQ: Describe the business need for security training today. How and why are online courses a good fit for meeting it?

RC: Technologies are shifting so fast, and attack surfaces are expanding so fast, that it is tough to keep up with it all. Equipping personnel with the right skills is critical. Research has shown that companies retain their people when they continually train them, but the tech and IT security training landscape is problematic. The traditional model is to send people on a one-week course, where they cram in lots of material at a cost of $3,000 to $6,000. The industry would certify people in whatever course they took, and that’s another $300 to $1,000 for the test. It’s also inaccessible: if you are not close to a major metropolitan area then you have to travel. And if a company doesn’t sell enough seats in a course, then the course might be canceled. That’s inconvenient. It became obvious when companies like Pluralsight and Lynda started having massive success that online had become the preferred way to do training. You can do it at your own pace, and it’s much more affordable. Microbursts of learning are what seem to work best for most people.

OPAQ: From looking at your user base and most popular courses, what trends do you see that correlate to security education and/or security needs?

RC: Concepts like the DoD 8140 directive for the federal government and pen testing are popular with consumers; on the enterprise side, incident response and threat intelligence are popular. The enterprise product, which is the paid side of the business and includes full access to all the learning tools, is seeing 20-30% revenue growth monthly. Yet we also know that many security teams are not getting training, and that’s surprising.

OPAQ: In what cases is online learning not an appropriate match for security professionals?

RC: People tend to go to classrooms when there is pressure to learn something in a specific time period, when they need mentorship, or hands-on training. I think where online falls short is in the accountability aspect, but you can design courses with gamified concepts to help keep people engaged. It’s like going to the gym on a regular basis. Sometimes you don’t see what the reward is going to be, so maybe you won’t go.

OPAQ: Aside from training and education, what else is critical to closing the security skills gap in this country?

RC: The final piece is assessment. Let’s say a stay-at-home mom who used to work in IT wants to go back to work after being at home for five years. She’d like to work in cyber but she’s got no experience. So even if she goes and takes a $5,000 course for a week, that’s still not enough, and getting a two-year degree is expensive and really not convenient. That is very high friction. The same is true for someone just starting out. A degree is not useful without experience. If an individual takes online training and completes an assessment that puts them through real-world scenarios and scores their performance, they have something concrete to show. There is a company called Cyberscore that offers tech assessments for system administrators. Coding challenges are another way to do this. The point is, people need a transparent way to show employers that they are technically proficient in a security skill.

Good Security Depends Upon Automation, Analytics and Outsourcing

Joshua Margolin is Principal Analyst at Clutch. He received his BA in Business Communications from the University of New Hampshire, and his MA in Technology & Entrepreneurship from Georgetown University.

OPAQ: What are the hottest areas within the security tech sector right now in terms of customer demand and innovation?

JM: To set the stage, companies worry most about whether they will be too late in implementing security technology. Another important consideration is the job market, because there isn’t enough cyber security talent to go around. Companies don’t know where they stand from a risk profile standpoint and once they do, many aren’t sure how to address it. There’s going to be less of a demand for security consultants and analysts because more companies will defer to automation solutions for detection, monitoring, privileged access and transparency. The fact that you can subscribe to security services in the cloud means that you don’t need to hire a team of experienced analysts. Our recent survey indicated that 70% of large companies will invest more in cybersecurity technology over the next year.

Another top category is Internet of Things (IoT). Large enterprises have a lot to gain by integrating IoT into their core business. On the consumer side, we are seeing more of these devices all the time – from smart home and car technology to wearables. Companies need to determine whether or not they should invest money in endpoint protections considered outside the traditional realms of interaction.

OPAQ: What types of customers are becoming more interested in cloud or outsourced security services and how do you think this market will evolve?

JM: It makes sense to outsource these activities, especially for smaller companies, because it’s so expensive to staff your own team of security experts. Yet before you spend money with any vendor, it’s worth the investment to hire a threat intelligence agency. These companies audit internal data and practices while considering the wider marketplace, all in an effort to determine what threats would most likely be encountered. Companies easily fall into the illusion that technology is the panacea. Not every business requires the same degree of security or even the same approach. It’s also important to remember that at least half of a company’s needs can be addressed by sound policy and effective training. For many companies, hiring a SaaS provider or two is sufficient. With larger project scopes, an MSSP is ideal because it will integrate several complementary SaaS products and manage the vendor relationships.

OPAQ: Earlier this year, both Gartner and IDC predicted 7-8% growth in worldwide IT security spending. How do companies best decide how to use a bigger budget?

JM: It will first depend on what internal expertise they have out of the gate. Any company that has a CSO or CIO has experience and networks to help figure this out. What’s difficult is when a company has no internal IT to rely on. This leaves them at the mercy of vendors’ salesmanship. They might be driven by the fear factor or they might misallocate budget to bring a contractor in-house. This only drives the costs way up. It might offer more peace of mind when compared to outsourcing but then the company is limited by the expertise of any single person. There’s a lot more to gain by tapping into wider talent pools.

OPAQ: Are developers and engineers having a hard time staying abreast of threats and developing the right solutions to counteract new threats and recover from them?

JM: The market for malware and ransomware is booming. There are a lot of talented people out there with malicious intent. These actors are often well financed by corporations or governments and they will find a way in; it’s only a matter of time. Technologists and engineers on the good side are always going to be chasing down the black hat actors. It’s better to be adaptive and react in the nick of time, all made more possible than ever thanks to advances in predictive analytics and artificial intelligence. That’s where the new frontier is for cybersecurity.