Insight

  • In part 2 of our SOAPA video, I welcome back my astute colleague, Dave Gruber. The conversation turns to XDR, a market segment that Dave and I collaborate on. I ask Dave about:

    • The definition of XDR. It’s a nebulous industry term but Dave nails it by explaining that XDR is a method for bringing controls together to improve security telemetry collection, correlation, contextualization, and analytics. There’s also an operational side of XDR to help coordinate response and remediation across multiple controls simultaneously.
    • Whether XDR is a product or an architecture. When Dave and I first put our heads together on XDR, we realized that it looks a heck of a lot like SOAPA. Since XDR is often presented as an integrated suite from a single vendor, it’s kind of a product. Alternatively, some vendors offer open APIs and a partner ecosystem, so it’s kind of an architecture as well. Regardless, it’s still definitely SOAPA!
    • Where XDR is today. Dave admits that it’s early on for XDR and current versions start with common data collection and correlation, acting as a data lake for security analysts. Many vendors are adding advanced analytics as well. The goal is to detect “low and slow” attacks that compromise systems, move laterally across networks, escalate privileges, and ultimately exfiltrate data. In theory, XDR can detect these campaigns as it has coordinated eyes on everything.

    My SOAPA video with Dave was going so well that I invited him back for Part 3 of our video. Unprecedented! Stay tuned.

  • 10 Security Changes Post-COVID-19

    Back in March, I heard from several CISOs about how COVID-19 was disrupting their cybersecurity programs and changing their priorities. A few weeks later, I connected with some CISO friends, and got an update on phase 2 of their pandemic journeys.

    While no one knows when the coronavirus impact will end, we are getting a good perspective on what the new normal will look like. Here are ten changes I anticipate (in no particular order):

    1. Work from home (WFH) becomes the default model. This one is an obvious assumption, but one we can back up with data: According to ESG research, 79% of IT executives say that their organization will be more flexible about WFH policies after the pandemic subsides. Furthermore, WFH seems to be, well, working: 78% of knowledge workers report being either more productive working from home or having no change in productivity. Between productivity gains and real estate savings, WFH is a winner — and is driving lots of changes to security investment and priorities.
    2. Any remnant of a security perimeter is now dead. When I started in security nearly 20 years ago, a group of financial services companies started an organization called the Jericho Forum, which pitched the concept of de-perimeterization. While most security professionals agreed with the idea, scaling security remained a challenge, so network perimeters remained and changed slowly over time. COVID-19 may be the final security perimeter coffin nail. To support a more distributed IT infrastructure, security controls will move wholesale to endpoints — users, devices, applications, data, etc. The good news is that cloud-based management planes will make this architecture much easier to scale and operate than in the past. What are the new perimeters? Users and devices (i.e., identities) and data.
    3. Hail to the cloud. Cloud workload migration accelerated due to COVID-19 as it was easier to administer cloud infrastructure than on-premises servers, networks, and storage devices. To keep up, CISOs must ramp up cloud security hiring, training, and skills development on their teams. It’s also clear now that the public cloud is the de facto infrastructure for network security controls, consolidating SD-WAN and security services. The same is true for security analytics with data and analytics engines moving quickly to the cloud. Finally, security management planes are heading in the same cloudy direction. CISOs will need new skills for migrating data and tools and managing cloud subscriptions.
    4. The mainstreaming of attack surface management (ASM). CISOs will need better ways to collect, process, and analyze data for cyber-risk management as users and assets become more distributed and remote. This should happen quickly since most organizations have no idea about all the connections to their network and regularly discover things like previously unknown devices, misconfigured servers, default passwords, partner connections, etc. ASM will evolve from an esoteric area to an enterprise requirement. Vendors like BitSight, Bugcrowd, CyCognito, Randori, and others will benefit from this transition.
    5. Doubling down on policy management. With everything distributed, CISOs will need to work with business managers to determine who can do what from where and really (and I mean really) tighten up their security policies with granular and dynamic rule sets. Once policies are determined, they’ll also need the CIO’s help to build an infrastructure for policy enforcement and monitoring. There is a tremendous opportunity for security technologies here — vendors that build intuitive, flexible, and scalable policy management engines will clean up.
    6. Identity management gets an overhaul. Distributed security controls and policy management must be anchored by a modern identity management infrastructure — not the organically grown patchwork we’ve kludged together over the past 20 years. To ease this migration, identity will also migrate to the cloud in a hurry. This is good news for JumpCloud, Okta, and Ping, but I believe cloud service providers like Amazon, Google, VMware, and obviously Microsoft will make a big play here as well.
    7. Cyber threat intelligence at scale. COVID-19 is a global opportunity for the cyber-underworld, leading to a wave of new scams and attacks. To counteract this trend, organizations need to be able to operationalize, analyze, and hunt for threats at an unprecedented scale. This should represent a growth opportunity for threat intelligence platforms and investigation tools like Anomali, King & Union, Palo Alto Networks, RecordedFuture, ThreatConnect, and ThreatQuotient at the high end of the market. Smaller enterprises will likely dive deeper into threat intelligence services from the likes of Cisco, FireEye, IBM, and Secureworks.
    8. AI and ML, the next generation. Security teams will need to make sense of more assets, more connections, more movement, and more threats — all at once. Business management’s push for a permanent WFH structure makes this an absolute certainty, and there isn’t a security team on the planet that will be able to keep up with the new reality without help. We are currently driving up the AI/ML on-ramp, and we’ll need to get up to speed quickly. This is a wide open opportunity, but somehow, I think that companies like Devo, Google (Chronicle), IBM, Microsoft, SAS, and Splunk will play.
    9. On to serious security training. WFH and coronavirus-related scams mean the days of security awareness training as a “checkbox” exercise are over. Moving forward, I believe security aptitude will be required for most employees with compensation incentives or penalties associated with performance. Business managers will also be accountable for employee education and penalized when their team’s ignorance leads to a security breach. On the supply side, vendors will need to supplement basic compliance training with more thorough course work designed for knowledge workers.
    10. Tighter security and IT operations cooperation. Provisioning secure endpoints, cloud workloads, or network infrastructure will require security to be “baked in” rather than “bolted on.” Additionally, security policy enforcement and monitoring will need to be coordinated all over the place. In the past, security and IT operations teams had different objectives, metrics, and compensation structures. Given all the work ahead, it’s likely that organizations will measure these teams based upon common projects rather than disparate goals. This should be good news for vendors like ExtraHop, Netscout, ServiceNow, and Tanium that have technologies and experience in both areas. Security vendors will need to improve their IT operations chops if they want to keep up.

    There are lots of changes and lots to think about. More soon from me as I’m following the impact of COVID-19 closely.

  • The SOAPA video series has featured prolific industry luminaries representing leading security operations technology vendors. That will continue, but I thought I’d shake up the format a bit by inviting my colleague and friend, Dave Gruber, to participate.

    Aside from his movie star good looks, I invited Dave to participate because he spent several years at Carbon Black in the EDR market, and EDR has become a primary component of SOAPA. Furthermore, Dave and I are co-covering a burgeoning segment called XDR, which is sort of a vendor-driven turnkey SOAPA offering. 

    In part 1 of our video, Dave and I chat about:

    • The role of EDR. Dave talks about how EDR monitors endpoint telemetry and works with SIEM and SOAR to accelerate and automate incident response. 
    • EDR integration. SOAPA is all about integration and interoperability for security operations. Dave says that EDR is often paired with network traffic analysis (NTA), cloud data, email security data, and other sources. Everything rolls up to the SOC for analysis, investigations, and remediation actions.
    • EDR adoption. Dave tells us about ESG research indicating that EDR is gaining market penetration, especially as part of new endpoint security suites.
    • EDR vs. MDR. I ask Dave what makes organizations buy and operate EDR as opposed to using a managed detection and response (MDR) solution. Dave explains that EDR has gotten easier and many customers want to “own” security analytics and operations. Nevertheless, MDR is a viable alternative or can be used to augment the security staff’s capacity and skills.

    Great stuff! Look for more from Dave and me in part 2 of our SOAPA video soon.

  • John shares some great perspective on VDI solutions and how remote work has inspired the future of work. We also discuss the pros and cons of VPN and how VDI helps overcome some of the traditional challenges of VPN. The other topic we hit on is deployment choice and how businesses may choose on-prem or off-prem VDI deployment models.

  • VeeamON Goes Digital

    VeeamON used to have 2,000 attendees. That was in the “old” world of a few months ago. The “new normal” has forced the company to pivot its popular event to digital, as is happening across the industry, with varying degrees of success and execution prowess.

    Veeam nailed it.

    One interesting thing happened: going digital “democratized” the event, making it available to 20 times more people than the physical event could accommodate. So out of a “bad” thing, a good thing happened. Veeam will also hold localized versions of the event in various parts of the world over the next few weeks, such as in Europe. However, it was pretty clear that many did not want to wait and joined the event despite the time difference.


  • Data integration is hard. Over the years, of all the technologies and processes that are part of an organization’s analytics stack/lifecycle, data integration has consistently been cited as a challenge. In fact, according to recent ESG research, more than 1 in 3 (36%) organizations say data integration is one of their top challenges with data analytics processes and technologies. The data silo problem is very real, but it’s about so much more than having data in a bunch of locations and needing to consolidate. It’s becoming more about the need to merge data of different types and change rates; the need to leverage metadata to understand where the data came from, who owns it, and how it’s relevant to the business; the need to properly govern data as more folks ask for access; and the need to ensure trust in data, because if there isn’t trust in the data, how can you trust the outcomes derived from it?

    Whether ETL or ELT, the underlying story is the same. At some point, you need to extract data from its source, transform it based on the destination tool and/or merging data set, and then load it into the destination tool, whether that be something like a data warehouse or data lake for analysis. While we won’t get into the pros and cons of ETL or ELT, the ETL process is still prevalent today. And this is due in part to the mature list of incumbents in the ETL space, like Oracle, IBM, SAP, SAS, Microsoft, and Informatica. These are proven vendors that have been in the market for multiple decades and continue to serve many of the largest businesses on the planet. There are also several new(ish) vendors looking to transform the data integration market. Companies like Google (via its Alooma acquisition), Salesforce (via MuleSoft), Qlik (via its Attunity acquisition), and Matillion all have growing customer bases that are embracing speed, simplicity, automation, and self-service.
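    The extract-transform-load flow described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration: the CSV source, the cleanup rules, and the SQLite destination are all assumptions standing in for a real source system and data warehouse, not any particular vendor’s pipeline.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    # Extract: pull raw rows out of the source system (here, a CSV export).
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    # Transform: normalize names and types to match the destination schema.
    return [(r["customer_id"].strip(), float(r["order_total"])) for r in rows]

def load(rows, conn):
    # Load: write the shaped rows into the destination
    # (an in-memory SQLite table stands in for a data warehouse).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (customer_id TEXT, order_total REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

source = "customer_id,order_total\n C100 ,19.99\nC101,5.00\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
```

    An ELT pipeline reorders the same three steps: the raw rows are loaded into the destination first, and the transformation runs inside the destination engine itself.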

    Whichever approach you take to data integration, I keep hearing the same things from customers: “Vendor X is missing a feature” or “I wish I could…” or “I can’t get buy-in to try a new solution because the technology isn’t mature” or “that sounds great, but it’s a lot of work and we’re set in our ways” or “I’m just going to keep using Vendor Y because it’s too disruptive to change.” And every time I hear these common responses, I ask the same follow-up question: what’s your ideal tool? Everyone wants to ensure the technology is secure, reliable, scalable, performant, and cost-effective, but I wanted to understand the more pointed wants of the actual folks who are struggling with data integration challenges day in and day out.

    Without further ado, I present to you the top list of “wants” when it comes to an ideal data integration tool/product/solution/technology:

    1. Container-based architecture – Flexibility, portability, and agility are king. As organizations are transforming, becoming more data-driven, and evolving their operating environments, containers enable consistency in modern software environments as organizations embrace microservice-based application platforms.
    2. GUI and code – Embrace the diversity of personas that will want access to data. A common way I’ve seen organizations look at this is that (generally speaking) the GUI is for the generalists and the code behind is for the experts/tinkerers. By the way, this mentality is evolving as modern tools are looking to help the generalists and experts alike with more automation via no-code/low-code environments and drag-and-drop workflow interfaces.
    3. Mass working sets – Common logic or semantic layers are desired. The last thing an engineer or analyst wants to do is write unique code for each individual table. It doesn’t scale and becomes a nightmare to maintain.
    4. Historic and streaming – Using batch and ad hoc processing on both historic and streaming data will ensure relevant outcomes. Organizations increasingly want hooks to better meet the real-time needs of the business, and that means real-time availability and access to relevant data without having to jump through hoops.
    5. Source control with branching and merging – Code changes over time. Ensure source control is in place to understand how and why code has changed. Going hand in hand with source control is the ability to support branching and/or merging of code to address new use cases, new data sources, or new APIs.
    6. Automatic operationalization – This is focused on the DevOps groups. Ensure new workflows can easily go from source control to dev/test or production. Deployment is the first priority, but do not lose sight of management and the iterative nature of data integration processes as users, third-party applications, and data change and evolve.
    7. Third-party integrations and APIs – The analytics space is massive and fragmented. The more integrations with processing engines, BI platforms, visualization tools, etc., the better. And ensure the future of the business is covered, too. That means incorporating more advanced technology that feeds data science teams, like AI and ML platforms and services.
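    Item 3 above is easy to illustrate: instead of hand-writing extract logic for every table, engineers want one parameterized routine driven by shared metadata. Here is a minimal sketch of that idea; the table names and column mappings are hypothetical, and a real tool would keep this metadata in a catalog rather than a dict.

```python
# One generic routine, driven by metadata, instead of one script per table.
TABLES = {
    # source table -> (columns to keep, destination table)
    "orders":    (["id", "total"], "dw_orders"),
    "customers": (["id", "email"], "dw_customers"),
}

def build_statements(tables):
    # Emit one INSERT ... SELECT per table from the shared mapping, so
    # onboarding a new table is a one-line metadata change, not new code.
    stmts = []
    for src, (cols, dest) in tables.items():
        col_list = ", ".join(cols)
        stmts.append(
            f"INSERT INTO {dest} ({col_list}) SELECT {col_list} FROM {src}"
        )
    return stmts

for stmt in build_statements(TABLES):
    print(stmt)
```

    The same metadata layer is what makes items 5 and 6 practical: a mapping like this can live in source control, be branched and merged, and be promoted from dev/test to production like any other code artifact.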

    While this list is by no means complete or all-encompassing, it speaks to where the market is headed. Take it from the data engineers and data architects: they’re still primarily ETLing and ELTing their lives away, but they want change and recognize there are opportunities for vast improvement. Marginal improvement without massive disruption is the preferred approach. So a note for the vendors: it’s about meeting customers where they are today and minimizing risk as they continue on their data transformation journeys.

  • Dell takes OneFS to the Next Level with PowerScale

    This week, Dell Technologies announced the release of its new PowerScale storage platform for unstructured (file and object) data. Despite the new name, PowerScale leverages the same OneFS technology that was the backbone of the Isilon brand for so many years.

    Why isn’t it just called Isilon? Multiple reasons. Dell looks to be consolidating the naming of its infrastructure portfolio under “Power” brands, such as PowerMax, PowerStore, PowerOne, and now PowerScale. Also, there is a lot more in this launch than just a new appliance.


  • It’s always a pleasure to speak with Carisa as she shares her enthusiasm for the market and the variety of experiences Citrix customers are working through today. We also touch on the potential importance of user experience monitoring and how it may play into future remote work decisions.

  • The COVID-19 crisis is creating a new normal for IT executives and knowledge workers alike. Citywide closures, stay-at-home mandates, and social distancing measures are forcing many people to work from home to help flatten the curve of a global pandemic. For IT executives, meeting the needs of these remote workers calls for new technologies, stringent protection measures, and operational agility. Knowledge workers, on the other hand, must take advantage of collaboration platforms while minimizing distractions and overcoming technical shortcomings.

    Despite these challenges and persistent health and safety concerns, COVID-19 presents a prime opportunity to redesign the workplace, rethink how work gets done, and get a head start on the future.

    ESG conducted an in-depth survey of 500 North American senior IT decision makers and 1,008 corporate knowledge workers now working at home. Survey participants represented midmarket (100 to 999 employees) and enterprise-class (1,000 employees or more) organizations in North America (United States and Canada).


  • Listen to hear how Citrix is helping customers and the unique position that Citrix holds in the market with its ability to deliver a variety of work-from-home solutions. Vishal also shares how important the cloud consumption model has been during these times and some potential maneuvers he sees as customers think through what’s next.

  • Brad shares some great perspective from a customer point of view and what he is seeing with Workspot customers during the COVID-19 pandemic. We chat through the varying degrees of success businesses are having, the work-at-home kit, and the different measures businesses may take to rebound to a mix of remote work and office work.

    Listen in to learn more about BYOH (bring your own home) and Double DaaS.

  • Software-defined storage promised freedom from hardware lock-in. It has delivered, in part, on that promise. However, procuring enterprise storage technology as software separate from hardware creates new complexities. When IT organizations are aiming to add infrastructure agility and flexibility, the architecture becomes more important than the delivery model. Solutions featuring a software-defined architecture will provide the greatest value, whether they are deployed and delivered as software, as an array, as hyperconverged infrastructure, or as something else.
