Securing digital experiences without disrupting business is more than a technology challenge: it’s a matter of survival. However, today’s apps are built on multiple architectures, distributed over various cloud environments, and connected by an exploding number of APIs. This unprecedented level of complexity requires more skill and resources to manage, and creates more points of vulnerability.
Stay up to date on the latest trends in app and API protection, secure multicloud networking, the future of application services, and the technologies and architectures emerging from the transition to an AI-assisted future, along with insights into emerging threats, through articles and blogs from the Office of the CTO.
Despite changes in architecture and location, the security problems do not change. Customers still need to protect applications, ensure availability, and defend against DDoS attacks. Application security is simply a bigger challenge now, owing to expanding attack surfaces and a growing number of vulnerabilities.
There are two walls in the delivery path for applications. The first (which has been the cause of much friction) is between development and production. And while DevOps methodologies have begun to break down this wall, the other—the one between production and delivery to consumers—is not so easily breached.
In the three phases of digital transformation, the first phase is all about automation. The focus on digitizing workflows in phase two will ultimately offer businesses a path to the third phase, where data-driven services generate actionable insights that improve efficiency, reduce process friction, and increase both productivity and profits.
Once confined to the data center, Application Performance Monitoring (APM) has become increasingly driven by the context of web-based user experiences. Today it isn't enough to identify what went wrong after the fact; businesses need to identify where trouble might occur before it happens.
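To make "before it happens" concrete, here is a minimal sketch in Go of the kind of baseline signal proactive monitoring can start from: flagging a request latency that drifts well beyond a rolling baseline. The window size, threshold, and sample values are illustrative assumptions, not tuned recommendations.

```go
// A toy proactive-monitoring check: alert when a latency sample exceeds
// the rolling baseline by three standard deviations, before users notice.
package main

import (
	"fmt"
	"math"
)

// baseline keeps the last `size` latency samples.
type baseline struct {
	samples []float64
	size    int
}

func (b *baseline) add(v float64) {
	b.samples = append(b.samples, v)
	if len(b.samples) > b.size {
		b.samples = b.samples[1:]
	}
}

// meanStddev returns the mean and standard deviation of the window.
func (b *baseline) meanStddev() (float64, float64) {
	var sum float64
	for _, v := range b.samples {
		sum += v
	}
	mean := sum / float64(len(b.samples))
	var sq float64
	for _, v := range b.samples {
		sq += (v - mean) * (v - mean)
	}
	return mean, math.Sqrt(sq / float64(len(b.samples)))
}

func main() {
	b := &baseline{size: 50}
	// Simulated request latencies in milliseconds; the last one is anomalous.
	for _, l := range []float64{110, 120, 115, 118, 112, 119, 400} {
		if len(b.samples) >= 5 { // wait for a minimal baseline
			if mean, sd := b.meanStddev(); l > mean+3*sd {
				fmt.Printf("warning: %.0fms is far above baseline %.0fms (sd %.1fms)\n", l, mean, sd)
			}
		}
		b.add(l)
	}
}
```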
There is an ebb and flow to technology cycles. That inevitability shows up in many theories, such as the S-curve of innovation and a certain well-known analyst firm's hype cycle. Cloud is subject to these same waves. We've seen two very strong cycles over the past ten years, and it appears that a third wave is beginning to crest.
Data comes from a variety of sources across the code-to-customer path. Applications. Platforms. Application services. Infrastructure. Devices. All are capable of emitting data that can be turned into business value by the right solution. The elephant in the room during any discussion on the use of data is, of course, privacy.
The first step in a discussion about data architecture is to define what the concept of “data architecture” encompasses. Unsurprisingly, the answer turns out to be nuanced—it is layered and multifaceted. To help ground the discussion, it is useful to start by thinking about it in terms of the journey of collected telemetry data.
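As a rough illustration of that framing, the Go sketch below models the journey as a series of stages a telemetry event passes through: enrichment near the source, normalization in transit, and finally landing somewhere it can be analyzed. The stage names and fields are assumptions for illustration, not a formal taxonomy from the article.

```go
// A telemetry event's journey expressed as composable pipeline stages.
package main

import "fmt"

// Event is one collected telemetry record.
type Event struct {
	Source string
	Metric string
	Value  float64
	Tags   map[string]string
}

// Stage is one architectural layer the event passes through.
type Stage func(Event) Event

func main() {
	journey := []Stage{
		// Enrich close to the source: attach deployment context.
		func(e Event) Event {
			e.Tags["env"] = "production"
			return e
		},
		// Normalize in transit: map source-specific names to a common schema.
		func(e Event) Event {
			e.Metric = "http.request.latency_ms"
			return e
		},
	}

	e := Event{Source: "app-gateway", Metric: "latency", Value: 142, Tags: map[string]string{}}
	for _, stage := range journey {
		e = stage(e)
	}
	fmt.Printf("ready for storage and analysis: %+v\n", e)
}
```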
Digital payments have become as common as cash used to be. Shutdowns due to the COVID-19 pandemic have only accelerated the rate at which consumers rely on such services. The shift has also accelerated digital payments on the corporate side. After all, businesses still have accounts payable and receivable whether they're open to the public or not.
Today, F5 offers the most comprehensive application services along the code-to-customer path. For the future, we are doubling down on application telemetry and analytics to help our customers discover insights about their applications, business flows, and user experiences. As we build out our cloud analytics capabilities, we'll further leverage ML/AI to help our customers improve their business services.
It should be no surprise that a new generation of application architectures is accompanied by a new generation of load balancing. Since the birth of load balancing, just before the turn of the century, the technology has moved at a predictable pace, which means it is time for innovation.
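For context on where that innovation starts, here is a minimal Go sketch of the technique the first generation was built on: round-robin distribution of requests across a pool of backends. The backend addresses and listening port are placeholders.

```go
// A toy round-robin load balancer built on the standard library's
// reverse proxy; real products add health checks, persistence, and more.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	backends := []string{"http://10.0.0.1:8080", "http://10.0.0.2:8080"}
	proxies := make([]*httputil.ReverseProxy, len(backends))
	for i, b := range backends {
		u, err := url.Parse(b)
		if err != nil {
			log.Fatal(err)
		}
		proxies[i] = httputil.NewSingleHostReverseProxy(u)
	}

	var next uint64 // incremented atomically so concurrent requests rotate fairly
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		i := atomic.AddUint64(&next, 1) % uint64(len(proxies))
		proxies[i].ServeHTTP(w, r)
	})
	log.Fatal(http.ListenAndServe(":9000", nil))
}
```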
Amid this pandemic, the systems processing unemployment claims in many U.S. states found themselves in peril. Developed using COBOL, the systems faced overwhelming demand, prompting urgent calls for those proficient in a programming language dating back to 1959. In parallel, many applications driving contemporary digital transformation efforts are microservices-based. These apps, like their COBOL forebears, are likely to be so critical that they, too, may still be running in 50 or 60 years.
The expansion of applications tied to organizations’ digital transformation efforts also increases the number of attack surfaces for bad actors to target, and the effects of the current pandemic are further accelerating this growth. Coupled with users’ tendency to reuse passwords, this creates a prime environment for credential stuffing attacks. This article highlights established best practices that can help both businesses and consumers protect themselves.
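One such practice is screening passwords against known breach corpora so that reused, already-leaked credentials are rejected up front. Below is a minimal Go sketch using the public Pwned Passwords range API, which is sent only the first five characters of the password's SHA-1 hash (k-anonymity), never the password itself; the example password is, of course, illustrative.

```go
// Check whether a password appears in known breach data before accepting it.
package main

import (
	"bufio"
	"crypto/sha1"
	"encoding/hex"
	"fmt"
	"net/http"
	"strconv"
	"strings"
)

// breachCount returns how many times the password appears in known breaches.
func breachCount(password string) (int, error) {
	sum := sha1.Sum([]byte(password))
	hash := strings.ToUpper(hex.EncodeToString(sum[:]))
	prefix, suffix := hash[:5], hash[5:]

	// Only the 5-character hash prefix ever leaves this process.
	resp, err := http.Get("https://api.pwnedpasswords.com/range/" + prefix)
	if err != nil {
		return 0, err
	}
	defer resp.Body.Close()

	// Each response line is "HASH_SUFFIX:COUNT".
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		parts := strings.SplitN(strings.TrimSpace(scanner.Text()), ":", 2)
		if len(parts) == 2 && parts[0] == suffix {
			return strconv.Atoi(parts[1])
		}
	}
	return 0, scanner.Err()
}

func main() {
	if n, err := breachCount("password123"); err == nil && n > 0 {
		fmt.Printf("seen in %d breaches: reject it or force a reset\n", n)
	}
}
```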
An insertion point is an architecturally distinct location in the code-to-customer data path where it makes sense to add functionality that is outside the purview of development or is more efficient to operate there. Insertion points include the client, the infrastructure, and the app itself. So what we're looking for are app services that are both operationally and cost efficient at the point of insertion; in this case, we're focused on the app server (platform) itself.
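As a toy example of insertion at the app server itself, the Go sketch below wraps a hypothetical application handler in middleware so that a timing service rides along with the app, at the platform level, rather than running in a separate tier.

```go
// Middleware as an insertion point: the service wraps the app handler,
// so the app itself needs no knowledge of the added functionality.
package main

import (
	"log"
	"net/http"
	"time"
)

// withTiming measures every request passing through the app server.
func withTiming(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		start := time.Now()
		next.ServeHTTP(w, r)
		log.Printf("%s %s took %v", r.Method, r.URL.Path, time.Since(start))
	})
}

func main() {
	app := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("hello"))
	})
	// The insertion happens here, at the platform, not in the app's code.
	log.Fatal(http.ListenAndServe(":8080", withTiming(app)))
}
```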
The dominance of video and SaaS traffic today is, in part, why remote access services are being overwhelmed. In parallel, telemedicine is on the rise during this pandemic, and with it, live video traffic. One way to increase capacity and improve performance for remote users is to update your remote access configuration to reflect the modern makeup of application traffic.
"Data is the new oil" or "Data is the grease of the digital economy." If you're like me, you've probably heard these phrases, or perhaps even the more business school-esque phrase "data exhaust monetization," to the point of being clichés. But like all good clichés, they are grounded in a fundamental truth, or in this case, in a complementary pair of truths.
Are we just using telemetry because it sounds sexier than data? Ultimately both data and telemetry are organized bits of information. To use them interchangeably is not a crime. But the reality is that, if you want to be accurate, there is a difference. And that difference will become increasingly important as organizations march into the data economy.
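One way to see the difference: a raw datum is just an organized value, while telemetry is a measurement collected remotely, carrying the context (source, metric, time) needed to act on it. The Go sketch below makes that distinction concrete; the field and metric names are illustrative assumptions.

```go
// The same number, with and without the context that makes it telemetry.
package main

import (
	"fmt"
	"time"
)

// Datum: organized information, but silent about where or when it arose.
type Datum struct {
	Value float64
}

// Telemetry: a measurement tied to a source, a metric, and a moment,
// collected remotely so it can drive decisions elsewhere.
type Telemetry struct {
	Value      float64
	Source     string
	Metric     string
	ObservedAt time.Time
}

func main() {
	fmt.Printf("datum:     %+v\n", Datum{Value: 0.93})
	fmt.Printf("telemetry: %+v\n", Telemetry{
		Value:      0.93,
		Source:     "edge-proxy-7",
		Metric:     "cache_hit_ratio",
		ObservedAt: time.Now(),
	})
}
```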
Put simply, one can define artificial intelligence as “teaching a computer how to mimic aspects of human intelligence.” To understand how AI and application services will work together in the future, it’s first necessary to examine three distinct types of AI: Strong, Weak, and Assistive.
The inability to verify the integrity or correctness of data should be of significant concern to organizations pursuing digital transformation efforts, which rely heavily on data. That data will be used not only to conduct business but also to form the basis for pattern and behavior recognition. Accordingly, it will power advanced analytics that make operational and business decisions automatically, without human intervention.
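One common way to make integrity verifiable, sketched below in Go under the assumption of a shared secret between producer and consumer, is to attach an HMAC tag to each record at collection time so downstream analytics can detect tampering before acting on the data. Key handling is simplified here for illustration.

```go
// Sign records at the source; verify before analytics consume them.
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// sign returns a hex-encoded HMAC-SHA256 tag for the record.
func sign(key, record []byte) string {
	mac := hmac.New(sha256.New, key)
	mac.Write(record)
	return hex.EncodeToString(mac.Sum(nil))
}

// verify recomputes the tag and compares it in constant time.
func verify(key, record []byte, tag string) bool {
	expected, err := hex.DecodeString(tag)
	if err != nil {
		return false
	}
	mac := hmac.New(sha256.New, key)
	mac.Write(record)
	return hmac.Equal(mac.Sum(nil), expected)
}

func main() {
	key := []byte("demo-key: use a managed secret in practice")
	record := []byte(`{"metric":"logins","value":42}`)
	tag := sign(key, record)

	fmt.Println("intact:  ", verify(key, record, tag))                   // true
	fmt.Println("tampered:", verify(key, []byte(`{"value":9000}`), tag)) // false
}
```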
What sets Shape and F5 apart is F5’s ability to capture high-fidelity data from our position in front of millions of mission-critical customer applications, combined with Shape’s sophisticated AI-assisted analytics platform. By integrating Shape and F5, we are executing on our vision to create an advanced set of security capabilities that can handle today’s most sophisticated attacks.
API stands for Application Programming Interface. Over the years, it has evolved from a tightly coupled imperative specification to a loosely coupled declarative model. Regardless of implementation and the mode of invocation, APIs tend to be associated with app development. But another API economy has been steadily expanding. It lies within operations. And in that domain, the "A" in API stands for automation.
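A small Go sketch of that contrast: in the imperative style the caller scripts each operational step, while in the declarative style the caller submits desired state and the automation layer converges on it. The functions and types below are illustrative, not any particular product's API.

```go
// Imperative vs. declarative automation, side by side.
package main

import "fmt"

// DesiredState is a declarative description of the outcome, not the steps.
type DesiredState struct {
	Service  string
	Replicas int
	TLS      bool
}

// apply lets the automation layer compute and perform the steps itself.
func apply(current *DesiredState, want DesiredState) {
	fmt.Printf("converging %s: replicas %d -> %d, tls=%v\n",
		want.Service, current.Replicas, want.Replicas, want.TLS)
	*current = want
}

// Imperative primitives: every step is an explicit call by the operator.
func createService(name string)       { fmt.Println("create", name) }
func scaleService(name string, n int) { fmt.Println("scale", name, "to", n) }
func enableTLS(name string)           { fmt.Println("enable tls on", name) }

func main() {
	// Imperative: the caller owns the sequence of operations.
	createService("checkout")
	scaleService("checkout", 3)
	enableTLS("checkout")

	// Declarative: one desired-state document, applied.
	state := &DesiredState{Service: "checkout", Replicas: 1}
	apply(state, DesiredState{Service: "checkout", Replicas: 3, TLS: true})
}
```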
Application architectures have evolved several times since the early days of computing, and it is no longer optimal to rely solely on a single, known data path to insert application services. Furthermore, because many of the emerging data paths are not well suited to a proxy-based platform, we must look to the other potential insertion points available to scale and secure modern applications.
Digital transformation is about moving technology from business interactions to processes to new models. At first, it's about apps. But as app portfolios expand, it turns to focus on automation and orchestration. With the increase in data generation, transformation becomes the pivot point for new business opportunities.
The future of security rests on telemetry that is more than technical data points picked out of packets. It requires a holistic view of interactions from client to application to behavior. Machine learning requires enormous amounts of data to establish and recognize patterns. This is why programmable proxies are such a critical part of an advanced security approach.
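To illustrate why the proxy's position matters, here is a minimal Go sketch of a programmable reverse proxy that emits one telemetry record per interaction (client, method, path, user agent, timing), which is the kind of raw material such machine learning consumes. The upstream address is a placeholder, and a real deployment would ship records to an analytics pipeline rather than log them.

```go
// A programmable proxy in the client-to-application path, emitting
// per-request telemetry as it forwards traffic.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"time"
)

func main() {
	upstream, err := url.Parse("http://127.0.0.1:8080") // placeholder app address
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(upstream)

	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		start := time.Now()
		proxy.ServeHTTP(w, r)
		// One record per interaction; ship to a pipeline in practice.
		log.Printf(`{"client":%q,"method":%q,"path":%q,"ua":%q,"duration_ms":%d}`,
			r.RemoteAddr, r.Method, r.URL.Path, r.UserAgent(),
			time.Since(start).Milliseconds())
	})
	log.Fatal(http.ListenAndServe(":9000", handler))
}
```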
Organizations are taking advantage of the proliferation of digital technologies to define new business models or to improve business productivity with existing models. While the pace of digital transformation varies based on the business and the sector it is in, overall, the journey of digital transformation has three stages.
Visibility remains key for organizations to secure, scale, and speed up applications. And the more widely apps are distributed, across containers, clouds, and more, the more widely we need to distribute the application services that provide the heightened visibility needed to enhance performance.
Some attacks are merely a nuisance, degrading network performance or disrupting availability. Others may be relatively rare but have far more serious impacts in the form of data breaches. Like the rat-borne plagues that devastated cities in a bygone era, today’s attacks wipe out brand reputations and business value.